Conversion of Qwen3-8B into the AWQ format and then into the CTranslate2 format.
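
The card only states the end result, so here is a minimal sketch of how such a conversion can be reproduced. It assumes recent versions of autoawq, transformers, and ctranslate2 that support Qwen3 and AWQ checkpoints; the directory names and AWQ quantization settings are illustrative defaults, not necessarily the exact values used to build this repository.

```python
# Sketch of the two-step conversion: full-precision -> AWQ -> CTranslate2.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer
from ctranslate2.converters import TransformersConverter

SOURCE = "Qwen/Qwen3-8B"       # original full-precision model
AWQ_DIR = "Qwen3-8B-AWQ"       # intermediate AWQ checkpoint (illustrative path)
CT2_DIR = "Qwen3-8B-ct2-AWQ"   # final CTranslate2 model directory

# Step 1: quantize the original weights to 4-bit AWQ.
model = AutoAWQForCausalLM.from_pretrained(SOURCE)
tokenizer = AutoTokenizer.from_pretrained(SOURCE)
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(AWQ_DIR)
tokenizer.save_pretrained(AWQ_DIR)

# Step 2: convert the AWQ checkpoint to the CTranslate2 format.
# The converter reads the AWQ quantization settings from the checkpoint's config.
converter = TransformersConverter(
    AWQ_DIR,
    copy_files=["tokenizer.json", "tokenizer_config.json"],
)
converter.convert(CT2_DIR, force=True)
```

Once converted, the model can be loaded and sampled with CTranslate2's generation API, for example:

```python
import ctranslate2
from transformers import AutoTokenizer

generator = ctranslate2.Generator("Qwen3-8B-ct2-AWQ", device="cuda")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-8B")

prompt_tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello, how are you?"))
results = generator.generate_batch(
    [prompt_tokens],
    max_length=128,
    sampling_temperature=0.7,
    include_prompt_in_result=False,
)
print(tokenizer.decode(results[0].sequences_ids[0]))
```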
