How to run it locally?
#6
by skychen · opened
I'm trying to run it locally, but it doesn't work with this:
import torch
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "nvidia/audio-flamingo-3"
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,
)
I get an error like this:
lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1002, in __getitem__
raise KeyError(key)
KeyError: 'llava_llama'
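From the traceback, my guess is that the checkpoint's config.json declares a custom model_type ("llava_llama") that the stock AutoConfig registry doesn't know, so the lookup fails. A rough sketch of the failing lookup; the config content and the set of known types below are illustrative stubs, not the real files:

```python
import json

# Hypothetical stub standing in for the repo's config.json (illustrative only).
config_json = '{"model_type": "llava_llama"}'
model_type = json.loads(config_json)["model_type"]

# transformers keeps a model_type -> config class registry roughly like a dict;
# a custom type such as "llava_llama" is absent unless code registers it first.
KNOWN_TYPES = {"llama", "llava", "mistral"}  # illustrative subset, not the real mapping

registered = model_type in KNOWN_TYPES
print(model_type, registered)  # llava_llama False -> AutoConfig raises KeyError('llava_llama')
```

If that's right, newer transformers alone won't fix it; the repo would need to ship remote code that registers the type (or you'd have to load it via the model's own codebase instead of AutoModel).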
I've updated/reinstalled transformers to the latest version, but it still doesn't work.
Any suggestions?
Same error here.