
Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning

Authors:
Singh, Shivalika
Vargus, Freddie
Dsouza, Daniel
Karlsson, Börje F.
Mahendiran, Abinaya
Ko, Wei-Yin
Shandilya, Herumb
Patel, Jay
Mataciunas, Deividas
O'Mahony, Laura
Zhang, Mike
Hettiarachchi, Ramith
Wilson, Joseph
Machado, Marina
Moura, Luisa Souza
Krzemiński, Dominik
Fadaei, Hakimeh
Ergün, Irem
Okoh, Ifeoma
Alaagib, Aisha
Mudannayake, Oshan
Alyafeai, Zaid
Chien, Vu Minh
Ruder, Sebastian
Guthikonda, Surya
Alghamdi, Emad A.
Gehrmann, Sebastian
Muennighoff, Niklas
Bartolo, Max
Kreutzer, Julia
Üstün, Ahmet
Fadaee, Marzieh
Hooker, Sara
Publication Year:
2024

Abstract

Datasets are foundational to many breakthroughs in modern artificial intelligence. Many recent achievements in natural language processing (NLP) can be attributed to the fine-tuning of pre-trained models on a diverse set of tasks that enables a large language model (LLM) to respond to instructions. Instruction fine-tuning (IFT) requires specifically constructed and annotated datasets. However, existing datasets are almost all in the English language. In this work, our primary goal is to bridge the language gap by building a human-curated instruction-following dataset spanning 65 languages. We worked with fluent speakers of languages from around the world to collect natural instances of instructions and completions. Furthermore, we create the most extensive multilingual collection to date, comprising 513 million instances through templating and translating existing datasets across 114 languages. In total, we contribute four key resources: we develop and open-source the Aya Annotation Platform, the Aya Dataset, the Aya Collection, and the Aya Evaluation Suite. The Aya initiative also serves as a valuable case study in participatory research, involving collaborators from 119 countries. We see this as a valuable framework for future research collaborations that aim to bridge gaps in resources.
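For readers who want to explore the released resources, the sketch below shows one possible way to load the human-curated Aya Dataset with the Hugging Face `datasets` library. The repository identifier `CohereForAI/aya_dataset`, the split names, and the `language` column used for filtering are assumptions and should be verified against the official dataset card before use.

```python
# Minimal sketch: loading the Aya Dataset via the Hugging Face `datasets` library.
# Assumptions: the dataset is hosted as "CohereForAI/aya_dataset", exposes a "train"
# split, and includes a "language" column; check the official dataset card to confirm.
from datasets import load_dataset

# Download the human-curated instruction/completion data (65 languages per the paper).
aya = load_dataset("CohereForAI/aya_dataset")

# Inspect the available splits and one record to see the schema.
print(aya)
print(aya["train"][0])

# Example: keep only examples annotated for a single language (column name assumed).
swahili = aya["train"].filter(lambda row: row["language"] == "Swahili")
print(len(swahili))
```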

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2402.06619
Document Type:
Working Paper