
The Rise and Down of Babel Tower: Investigating the Evolution Process of Multilingual Code Large Language Model

Authors:
Chen, Jiawei
Chen, Wentao
Su, Jing
Xu, Jingjing
Lin, Hongyu
Ren, Mengjie
Lu, Yaojie
Han, Xianpei
Sun, Le
Publication Year: 2024

Abstract

Large language models (LLMs) have shown significant multilingual capabilities, but the mechanisms by which these capabilities develop during pre-training are not well understood. In this paper, we use code LLMs as an experimental platform to explore the evolution of multilingual capabilities during pre-training. Based on our observations, we propose the Babel Tower Hypothesis, which describes the process by which LLMs acquire new language capabilities: during learning, multiple languages initially share a single knowledge system dominated by the primary language, then gradually develop language-specific knowledge systems. We validate this hypothesis by tracking the internal states of LLMs, identifying their working languages and language-transferring neurons. Experimental results show that the internal state changes of the LLMs are consistent with the Babel Tower Hypothesis. Building on these insights, we propose a novel method for constructing an optimized pre-training corpus for multilingual code LLMs; models trained on this corpus significantly outperform those trained on the original corpus. The Babel Tower Hypothesis thus provides new insight into designing pre-training data distributions that achieve optimal multilingual capabilities in LLMs.
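The abstract's central probe, identifying the working language of intermediate layers, can be illustrated with a logit-lens-style readout. The sketch below is an assumption-laden illustration rather than the paper's actual analysis pipeline: the model name (gpt2, a stand-in for the multilingual code LLM under study), the prompt, and the layer-wise decoding recipe are all hypothetical choices made for demonstration.

```python
# Illustrative sketch only: a logit-lens-style readout of each layer's
# "working language". Model, prompt, and probing recipe are assumptions;
# the paper's own method is not reproduced here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "gpt2"  # stand-in; one would use the multilingual code LLM under study
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL, output_hidden_states=True)
model.eval()

prompt = "def add(a, b):\n    return"
inputs = tok(prompt, return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# Apply the model's final layer norm, then project each layer's last-position
# hidden state through the unembedding matrix to see what that layer would
# predict if decoding stopped there.
ln_f = model.transformer.ln_f                   # GPT-2 specific; adapt per model
unembed = model.get_output_embeddings().weight  # [vocab_size, hidden_size]
for i, h in enumerate(out.hidden_states):
    logits = ln_f(h[0, -1]) @ unembed.T
    print(f"layer {i:2d}: top token = {tok.decode([logits.argmax().item()])!r}")
```

Under the Babel Tower Hypothesis, such a readout on a code LLM would show early and middle layers decoding into the primary language even for prompts in a secondary language, with language-specific predictions emerging only in later layers or at later training checkpoints.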

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2412.07298
Document Type: Working Paper