
Distributionally Robust Multilingual Machine Translation

Authors:
Zhou, Chunting
Levy, Daniel
Li, Xian
Ghazvininejad, Marjan
Neubig, Graham
Publication Year:
2021

Abstract

Multilingual neural machine translation (MNMT) learns to translate multiple language pairs with a single model, potentially improving both the accuracy and the memory efficiency of deployed models. However, the heavy data imbalance between languages hinders the model from performing uniformly across language pairs. In this paper, we propose a new learning objective for MNMT based on distributionally robust optimization, which minimizes the worst-case expected loss over the set of language pairs. We further show how to practically optimize this objective for large translation corpora using an iterated best response scheme, which is effective and incurs negligible additional computational cost compared to standard empirical risk minimization. We perform extensive experiments on three sets of languages from two datasets and show that our method consistently outperforms strong baseline methods in terms of average and per-language performance under both many-to-one and one-to-many translation settings.

Comment: Long paper accepted by EMNLP 2021 main conference
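The abstract states that the objective minimizes the worst-case expected loss over the set of language pairs. As a minimal sketch, in standard group-DRO notation (the symbols q, D_i, and \ell below are our own shorthand, not taken from the paper, which may further constrain the weight vector to an uncertainty set), this can be written as

\min_{\theta} \; \max_{\mathbf{q} \in \Delta_N} \; \sum_{i=1}^{N} q_i \, \mathbb{E}_{(x, y) \sim D_i}\!\left[ \ell(\theta; x, y) \right]

where \Delta_N is the probability simplex over the N language pairs, D_i is the training distribution of language pair i, and \ell is the per-example translation loss. The iterated best response scheme mentioned in the abstract would then alternate between updating the model parameters \theta and re-computing the weights \mathbf{q} as a best response to the current per-language losses.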

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2109.04020
Document Type:
Working Paper