License: This model is derived from Llama 2 and must therefore comply with the Llama 2 license.


Pruned-LLaMA-1.3B is a model obtained by pruning meta-llama/Llama-2-7b-hf and further pre-training the pruned network.

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the pruned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("minhchuxuan/pruned-1.3b")
tokenizer = AutoTokenizer.from_pretrained("minhchuxuan/pruned-1.3b")
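
Once loaded, the model can be used with the standard transformers generation API. Below is a minimal generation sketch; the prompt and decoding settings are illustrative, not prescribed by the model authors.

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a short cap on new tokens; adjust settings as needed.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))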