
Rethinking the Exploitation of Monolingual Data for Low-Resource Neural Machine Translation.

Authors :
Pang, Jianhui
Yang, Baosong
Wong, Derek Fai
Wan, Yu
Liu, Dayiheng
Chao, Lidia Sam
Xie, Jun
Source :
Computational Linguistics. Mar 2024, Vol. 50, Issue 1, p25-47. 23p.
Publication Year :
2024

Abstract

The utilization of monolingual data has been shown to be a promising strategy for addressing low-resource machine translation problems. Previous studies have demonstrated the effectiveness of techniques such as back-translation and self-supervised objectives, including masked language modeling, causal language modeling, and denoising autoencoding, in improving the performance of machine translation models. However, how these methods contribute to the success of machine translation tasks and how they can be effectively combined remain under-researched. In this study, we carry out a systematic investigation of the effects of these techniques on linguistic properties through the use of probing tasks, including source language comprehension, bilingual word alignment, and translation fluency. We further evaluate the impact of pre-training, back-translation, and multi-task learning on bitexts of varying sizes. Our findings inform the design of more effective pipelines for leveraging monolingual data in extremely low-resource and low-resource machine translation tasks. Experimental results show consistent performance gains across seven translation directions, providing further support for our conclusions and understanding of the role of monolingual data in machine translation.

Details

Language :
English
ISSN :
0891-2017
Volume :
50
Issue :
1
Database :
Academic Search Index
Journal :
Computational Linguistics
Publication Type :
Academic Journal
Accession number :
176724094
Full Text :
https://doi.org/10.1162/coli_a_00496