
Dual-Alignment Pre-training for Cross-lingual Sentence Embedding

Authors:
Li, Ziheng
Huang, Shaohan
Zhang, Zihan
Deng, Zhi-Hong
Lou, Qiang
Huang, Haizhen
Jiao, Jian
Wei, Furu
Deng, Weiwei
Zhang, Qi
Publication Year:
2023

Abstract

Recent studies have shown that dual encoder models trained with the sentence-level translation ranking task are effective methods for cross-lingual sentence embedding. However, our research indicates that token-level alignment is also crucial in multilingual scenarios, which has not been fully explored previously. Based on our findings, we propose a dual-alignment pre-training (DAP) framework for cross-lingual sentence embedding that incorporates both sentence-level and token-level alignment. To achieve this, we introduce a novel representation translation learning (RTL) task, where the model learns to use one-side contextualized token representation to reconstruct its translation counterpart. This reconstruction objective encourages the model to embed translation information into the token representation. Compared to other token-level alignment methods such as translation language modeling, RTL is more suitable for dual encoder architectures and is computationally efficient. Extensive experiments on three sentence-level cross-lingual benchmarks demonstrate that our approach can significantly improve sentence embedding. Our code is available at https://github.com/ChillingDream/DAP.

Comment: ACL 2023
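The dual objective the abstract describes can be sketched concretely: a sentence-level in-batch translation ranking loss over pooled embeddings, plus the token-level RTL objective, where one side's contextualized token representations are pushed to reconstruct their translation counterparts. The following is a minimal PyTorch illustration, not the authors' implementation (that is in the linked repository): the class name DAPSketch, the single-layer rtl_head, CLS pooling, the 0.05 temperature, the MSE reconstruction target, and the naive truncation used to align token sequences of different lengths are all assumptions made here for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel

class DAPSketch(nn.Module):
    """Illustrative sketch of a dual-alignment objective; not the paper's code."""

    def __init__(self, model_name="xlm-roberta-base", temperature=0.05):
        super().__init__()
        # Shared (dual) encoder applied to both sides of a translation pair.
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Hypothetical RTL head: transforms source-side token representations
        # toward their target-side counterparts (one small transformer layer).
        self.rtl_head = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=8, batch_first=True)
        self.temperature = temperature

    def forward(self, src, tgt):
        # src / tgt: tokenizer outputs (input_ids, attention_mask) for a batch
        # of parallel sentences.
        h_src = self.encoder(**src).last_hidden_state  # (B, L_src, H)
        h_tgt = self.encoder(**tgt).last_hidden_state  # (B, L_tgt, H)

        # Sentence-level translation ranking: in-batch InfoNCE over CLS
        # embeddings, where each sentence's true translation is the positive.
        s = F.normalize(h_src[:, 0], dim=-1)
        t = F.normalize(h_tgt[:, 0], dim=-1)
        logits = s @ t.T / self.temperature           # (B, B) similarities
        labels = torch.arange(s.size(0), device=s.device)
        ranking_loss = F.cross_entropy(logits, labels)

        # Token-level RTL: reconstruct target token representations from the
        # source side only. Truncating to the shorter length is a crude
        # stand-in for proper alignment handling.
        pred = self.rtl_head(h_src)                   # (B, L_src, H)
        n = min(pred.size(1), h_tgt.size(1))
        rtl_loss = F.mse_loss(pred[:, :n], h_tgt[:, :n].detach())

        return ranking_loss + rtl_loss

In use, src and tgt would come from tokenizing the two sides of a parallel corpus with the matching AutoTokenizer; because both objectives share one encoder forward pass per side, the token-level term adds little cost on top of the ranking loss, which is the efficiency argument the abstract makes against alternatives such as translation language modeling.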

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.09148
Document Type:
Working Paper