Xmodel-LM Technical Report
- Publication Year: 2024
Abstract
We introduce Xmodel-LM, a compact and efficient 1.1B language model pre-trained on around 2 trillion tokens. Trained on our self-built dataset (Xdata), which balances Chinese and English corpora based on downstream task optimization, Xmodel-LM exhibits remarkable performance despite its smaller size. It notably surpasses existing open-source language models of similar scale. Our model checkpoints and code are publicly accessible on GitHub at https://github.com/XiaoduoAILab/XmodelLM.
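If the released checkpoints follow the common Hugging Face layout, loading and sampling from the model might look like the sketch below. The Hub repository id `XiaoduoAILab/XmodelLM` is an assumption inferred from the GitHub URL, not confirmed by this record; consult the project repository for the actual checkpoint path. `trust_remote_code=True` is included on the assumption that the model may ship custom classes.

```python
# Minimal sketch: loading a causal LM checkpoint with Hugging Face transformers.
# The repo id below is hypothetical, inferred from the GitHub URL in the abstract.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "XiaoduoAILab/XmodelLM"  # hypothetical Hub id; check the project README

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Generate a short continuation from a prompt.
prompt = "Xmodel-LM is a compact 1.1B language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```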
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2406.02856
- Document Type: Working Paper