1. Analyzing Zero-shot Cross-lingual Transfer in Supervised NLP Tasks
- Author
Choi, Hyunjin; Kim, Judong; Joe, Seongho; Min, Seungjai; Gwon, Youngjune
- Subjects
Computer Science - Computation and Language, Computer Science - Artificial Intelligence
- Abstract
In zero-shot cross-lingual transfer, a supervised NLP task trained on a corpus in one language is directly applied to another language without any additional training. A source of cross-lingual transfer can be as straightforward as lexical overlap between languages (e.g., use of the same scripts, shared subwords), which naturally forces text embeddings to occupy a similar representation space. Recently introduced cross-lingual language model (XLM) pretraining identifies neural parameter sharing in Transformer-style networks as the most important factor for the transfer. In this paper, we aim to validate the hypothetically strong cross-lingual transfer properties induced by XLM pretraining. In particular, we use XLM-RoBERTa (XLMR) in experiments spanning semantic textual similarity (STS), machine reading comprehension (MRC) on SQuAD and KorQuAD, sentiment analysis, and alignment of sentence embeddings under various cross-lingual settings. Our results indicate that cross-lingual transfer is most pronounced in STS, followed by sentiment analysis, and weakest in MRC; that is, the more complex the downstream task, the smaller the degree of cross-lingual transfer. All of our results are empirically observed and measured, and we make our code and data publicly available.
- Comment
6 pages, 4 figures; to be published in the 25th International Conference on Pattern Recognition (ICPR 2020)
- Published
2021
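As a rough illustration of the zero-shot setup the abstract describes (not the authors' released code), the sketch below loads a pretrained XLM-RoBERTa encoder via the Hugging Face `transformers` library, mean-pools token states into sentence embeddings, and scores cross-lingual similarity between an English and a Korean sentence with cosine similarity. The checkpoint name, pooling choice, and example sentences are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): cross-lingual sentence similarity
# with a pretrained XLM-RoBERTa encoder. Assumes the `transformers` and
# `torch` packages and the public "xlm-roberta-base" checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")
model.eval()

def embed(sentences):
    """Mean-pool the last hidden states into one vector per sentence."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# English and Korean paraphrases: if cross-lingual transfer is present,
# their embeddings should be close without any task-specific training.
en, ko = embed(["The weather is nice today.", "오늘 날씨가 좋다."])
score = torch.nn.functional.cosine_similarity(en, ko, dim=0)
print(f"cross-lingual cosine similarity: {score.item():.3f}")
```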