---
library_name: transformers
license: mit
datasets:
  - CNCL-Penn-State/MuCE-Pref
language:
  - en
base_model:
  - CNCL-Penn-State/CrPO-sft-llama-3.1-8b-instruct
---

# CrPO-SFT-Llama-3.1-8B-Instruct-div

This model is [CNCL-Penn-State/CrPO-sft-llama-3.1-8b-instruct](https://huggingface.co/CNCL-Penn-State/CrPO-sft-llama-3.1-8b-instruct) preference-finetuned on the [MuCE-Pref](https://huggingface.co/datasets/CNCL-Penn-State/MuCE-Pref) dataset, following the Creative Preference Optimization (CrPO) paper. It is optimized for high output diversity.
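A minimal usage sketch with the `transformers` library is shown below. The repo id and the sampling values are assumptions for illustration (this card does not specify recommended generation settings); sampling-based decoding is used here because the model targets output diversity.

```python
# Hypothetical usage sketch for this checkpoint. The repo id and the
# generation settings below are illustrative assumptions, not tested defaults.

# Chosen to favor diverse generations: nucleus sampling instead of greedy decoding.
gen_kwargs = {"do_sample": True, "temperature": 1.0, "top_p": 0.95, "max_new_tokens": 256}

def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat format expected by Llama-3.1 Instruct models."""
    return [{"role": "user", "content": prompt}]

if __name__ == "__main__":
    # Import deferred so the helpers above can be used without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "CNCL-Penn-State/CrPO-SFT-Llama-3.1-8B-Instruct-div"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer.apply_chat_template(
        build_messages("Write an unusual use for a brick."),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, **gen_kwargs)
    # Decode only the newly generated tokens.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```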

## Citation

```bibtex
@misc{ismayilzada2025creativepreferenceoptimization,
      title={Creative Preference Optimization},
      author={Mete Ismayilzada and Antonio Laverghetta Jr. and Simone A. Luchini and Reet Patel and Antoine Bosselut and Lonneke van der Plas and Roger E. Beaty},
      year={2025},
      eprint={2505.14442},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.14442},
}
```