---
license: odc-by
viewer: false
task_categories:
  - text-generation
pretty_name: Bolmo Training Mix (December 2025)
language:
  - en
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/**/*
  - config_name: cute_style_character
    data_files:
      - split: train
        path: data/cute_style_character/**/*
  - config_name: stack_edu
    data_files:
      - split: train
        path: data/stack_edu-*/**/*
  - config_name: common_crawl
    data_files:
      - split: train
        path: data/common_crawl-*/**/*
  - config_name: wikipedia
    data_files:
      - split: train
        path: data/dolma_1_7-wiki-en/**/*
  - config_name: extra_stack_edu
    data_files:
      - split: train
        path: data/extra_stack_edu/**/*
  - config_name: finemath-3plus
    data_files:
      - split: train
        path: data/finemath-3plus/**/*
  - config_name: olmocr_science_pdfs
    data_files:
      - split: train
        path: data/olmocr_science_pdfs-*/**/*
  - config_name: arxiv
    data_files:
      - split: train
        path: data/rpj-proofpile-arxiv/**/*
---

# Bolmo Mix

Data used to train Bolmo, the first family of competitive fully open byte-level language models (LMs).

See our technical report for details: https://allenai.org/papers/bolmo.

| Name                    | Tokens | License |
|-------------------------|--------|---------|
| Common Crawl            | 121.0B | ODC-BY  |
| olmOCR Science PDFs     | 19.9B  | ODC-BY  |
| StackEdu                | 26.3B  | ODC-BY  |
| FineMath 3+             | 4.1B   | ODC-BY  |
| arXiv                   | 1.3B   | ODC-BY  |
| Wikipedia & Wikibooks   | 64.6M  | ODC-BY  |
| Character Understanding | 75.5M  | ODC-BY  |
| **Total**               | **172.7B** |     |
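As a quick sanity check, the per-source token counts in the table sum to the stated total (values converted to billions):

```python
# Per-source token counts from the table above, in billions of tokens.
tokens_b = {
    "Common Crawl": 121.0,
    "olmOCR Science PDFs": 19.9,
    "StackEdu": 26.3,
    "FineMath 3+": 4.1,
    "arXiv": 1.3,
    "Wikipedia & Wikibooks": 0.0646,    # 64.6M
    "Character Understanding": 0.0755,  # 75.5M
}

total_b = sum(tokens_b.values())
print(f"{total_b:.1f}B")  # 172.7B, matching the "Total" row
```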

Bolmo models are trained for less than one epoch (~39.3B tokens) on this mix.
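The `data_files` globs in the front matter select each subset by path pattern. A rough illustration of how such a pattern picks out files, using Python's `fnmatch` as an approximation of the Hub's glob semantics (the shard paths below are hypothetical, shaped like the patterns above):

```python
from fnmatch import fnmatch

# Hypothetical shard paths, shaped like the globs in the front matter.
paths = [
    "data/common_crawl-0001/shard/part-000.json.gz",
    "data/finemath-3plus/part-000.json.gz",
    "data/dolma_1_7-wiki-en/part-000.json.gz",
]

# The `common_crawl` config's pattern from the YAML above.
pattern = "data/common_crawl-*/**/*"

matches = [p for p in paths if fnmatch(p, pattern)]
print(matches)  # only the common_crawl shard matches
```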

## Licensing Information

Bolmo Mix is licensed under the Open Data Commons Attribution License v1.0 (ODC-By). It is intended for research and educational use. For more information, please see our Responsible Use Guidelines.

## Citation

```bibtex
@misc{bolmo,
      title={Bolmo: Byteifying the Next Generation of Language Models},
      author={Benjamin Minixhofer and Tyler Murray and Tomasz Limisiewicz and Anna Korhonen and Luke Zettlemoyer and Noah A. Smith and Edoardo M. Ponti and Luca Soldaini and Valentin Hofmann},
      year={2025},
      eprint={2512.15586},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2512.15586},
}
```