Revisiting the Primacy of English in Zero-shot Cross-lingual Transfer
- Publication Year:
- 2021
- Publisher:
- arXiv, 2021.
Abstract
- Despite their success, large pre-trained multilingual models have not completely alleviated the need for labeled data, which is cumbersome to collect for all target languages. Zero-shot cross-lingual transfer is emerging as a practical solution: pre-trained models later fine-tuned on one transfer language exhibit surprising performance when tested on many target languages. English is the dominant source language for transfer, as reinforced by popular zero-shot benchmarks. However, this default choice has not been systematically vetted. In our study, we compare English against other transfer languages for fine-tuning, on two pre-trained multilingual models (mBERT and mT5) and multiple classification and question answering tasks. We find that other high-resource languages such as German and Russian often transfer more effectively, especially when the set of target languages is diverse or unknown a priori. Unexpectedly, this can be true even when the training sets were automatically translated from English. This finding can have immediate impact on multilingual zero-shot systems, and should inform future benchmark designs.
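
The abstract describes the zero-shot cross-lingual transfer protocol the paper evaluates: fine-tune a pre-trained multilingual model on labeled data in a single transfer language, then test it on other target languages without any target-language labels. The sketch below illustrates that protocol under stated assumptions; the Hugging Face checkpoint, the XNLI dataset, the language choices, and the hyperparameters are illustrative stand-ins, not the paper's exact experimental setup.

```python
# A minimal sketch of zero-shot cross-lingual transfer, assuming Hugging Face
# transformers/datasets and the XNLI sentence-pair classification task.
# Checkpoint, transfer language, target languages, and hyperparameters are
# illustrative assumptions only.
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

CHECKPOINT = "bert-base-multilingual-cased"   # mBERT
TRANSFER_LANG = "de"                          # single fine-tuning language
TARGET_LANGS = ["es", "ru", "sw"]             # zero-shot evaluation languages

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=3)

def tokenize(batch):
    # XNLI pairs a premise with a hypothesis; labels are entail/neutral/contradict.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, padding="max_length", max_length=128)

def accuracy(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {"accuracy": float((preds == eval_pred.label_ids).mean())}

# Fine-tune on labeled data in the transfer language only.
train_ds = load_dataset("xnli", TRANSFER_LANG, split="train").map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlt_out", num_train_epochs=1,
                           per_device_train_batch_size=32),
    train_dataset=train_ds,
    compute_metrics=accuracy,
)
trainer.train()

# Zero-shot evaluation: the model never sees labeled data in the target languages.
for lang in TARGET_LANGS:
    test_ds = load_dataset("xnli", lang, split="test").map(tokenize, batched=True)
    print(lang, trainer.evaluate(eval_dataset=test_ds))
```

Swapping TRANSFER_LANG between English and other high-resource languages, with everything else held fixed, is the kind of comparison the study performs across mBERT and mT5 on classification and question answering tasks.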
Details
- Database:
- OpenAIRE
- Accession number:
- edsair.doi.dedup.....0bbdc29bafe20f9b71d9187ed48ac1ee
- Full Text:
- https://doi.org/10.48550/arxiv.2106.16171