1. InkubaLM: A small language model for low-resource African languages
- Authors
Tonja, Atnafu Lambebo, Dossou, Bonaventure F. P., Ojo, Jessica, Rajab, Jenalea, Thior, Fadel, Wairagala, Eric Peter, Aremu, Anuoluwapo, Moiloa, Pelonomi, Abbott, Jade, Marivate, Vukosi, and Rosman, Benjamin
- Subjects
Computer Science - Computation and Language
- Abstract
High-resource language models often fall short in the African context, where there is a critical need for models that are efficient, accessible, and locally relevant, even amidst significant computing and data constraints. This paper introduces InkubaLM, a small language model with 0.4 billion parameters, which achieves performance comparable to models with significantly larger parameter counts and more extensive training data on tasks such as machine translation, question answering, AfriMMLU, and AfriXnli. Notably, InkubaLM outperforms many larger models in sentiment analysis and demonstrates remarkable consistency across multiple languages. This work represents a pivotal advancement in challenging the conventional paradigm that effective language models must rely on substantial resources. Our model and datasets are publicly available at https://huggingface.co/lelapa to encourage research and development on low-resource languages.
- Published
2024