benjamin committed · Commit 5ce37de · verified · 1 Parent(s): 2c08e31

Update README.md

Files changed (1): README.md +12 -1
README.md CHANGED
@@ -77,4 +77,15 @@ Bolmo models are trained for less than one epoch (~39.3B tokens) on this mix.
 Bolmo Mix is licensed under the Open Data Commons Attribution License v1.0 (ODC-By). It is intended for research and educational use. For more information, please see our [Responsible Use Guidelines](https://allenai.org/responsible-use).
 
 ## Citation
-Forthcoming!
+
+```bibtex
+@misc{bolmo,
+  title={Bolmo: Byteifying the Next Generation of Language Models},
+  author={Benjamin Minixhofer and Tyler Murray and Tomasz Limisiewicz and Anna Korhonen and Luke Zettlemoyer and Noah A. Smith and Edoardo M. Ponti and Luca Soldaini and Valentin Hofmann},
+  year={2025},
+  eprint={2512.15586},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2512.15586},
+}
+```
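The BibTeX entry added in this commit can be used from a LaTeX document as follows. This is a minimal sketch, not part of the commit: the bibliography filename `references.bib` and the `plain` style are assumptions, and the entry is assumed to be saved verbatim into that file.

```latex
% Minimal sketch: assumes the @misc{bolmo, ...} entry above is saved in references.bib.
\documentclass{article}
\begin{document}
Byte-level training data for Bolmo is described in the report~\cite{bolmo}.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```

Note that the `eprint`/`archivePrefix` fields are only rendered by styles that understand arXiv metadata (e.g. via `natbib`/`biblatex`); plain BibTeX styles will fall back to the title, author, year, and URL.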