Does not work with vLLM
#3 opened by birolkuyumcu
vLLM raises this error:
ValueError: Unrecognized configuration class <class 'transformers_modules.ZJU_hyphen_AI4H.Hulu_hyphen_Med_hyphen_7B.4e39caa035d4c95a7ac70afeadc258f67de4853f.configuration_hulumed_qwen2.HulumedQwen2Config'> for this kind of AutoModel: AutoModel.
vLLM integration is currently in progress; please stay tuned for updates!
Hello! We now support vLLM. Please install our modified vLLM and try again.
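For readers landing here later, the general pattern might look like the following sketch. The fork URL is a hypothetical placeholder (the maintainers have not posted it in this thread; check the model card), and the error above suggests the model ships a custom HulumedQwen2Config, so remote code must be trusted when loading:

```shell
# Install the authors' modified vLLM build.
# <fork-url> is a placeholder; use the repository linked from the model card.
pip install git+<fork-url>

# Serve the model. --trust-remote-code lets vLLM load the custom
# configuration class (configuration_hulumed_qwen2) from the model repo.
vllm serve ZJU-AI4H/Hulu-Med-7B --trust-remote-code
```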