This repository contains Qwen/Qwen3-32B converted first to the AWQ quantized format and then to the CTranslate2 format.
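
A minimal inference sketch with the CTranslate2 Python API is shown below. The local directory name `Qwen3-32B-ct2-AWQ` is an assumption (use wherever you downloaded the converted weights), and the tokenizer is pulled from the original Qwen/Qwen3-32B repository on the assumption that the conversion did not change the vocabulary.

```python
import ctranslate2
import transformers

# Hypothetical local path to the converted CTranslate2 model directory.
model_dir = "Qwen3-32B-ct2-AWQ"

generator = ctranslate2.Generator(model_dir, device="cuda")
tokenizer = transformers.AutoTokenizer.from_pretrained("Qwen/Qwen3-32B")

# Build a chat-formatted prompt and convert it to the token strings
# that CTranslate2 generators expect as input.
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Explain AWQ quantization in one sentence."}],
    tokenize=False,
    add_generation_prompt=True,
)
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

results = generator.generate_batch(
    [tokens],
    max_length=256,
    sampling_temperature=0.7,
    sampling_topk=20,
    include_prompt_in_result=False,
)

# Decode the generated token ids back to text.
output_ids = results[0].sequences_ids[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```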
