
Sentence-Level Agreement for Neural Machine Translation

Authors :
Mingming Yang
Rui Wang
Kehai Chen
Masao Utiyama
Eiichiro Sumita
Min Zhang
Tiejun Zhao
Source :
ACL (1)
Publication Year :
2019
Publisher :
Association for Computational Linguistics, 2019.

Abstract

The training objective of neural machine translation (NMT) is to minimize the loss between the words in the translated sentences and those in the references. In NMT, there is a natural correspondence between the source sentence and the target sentence. However, this relationship has so far been represented only implicitly by the entire neural network, and the training objective is computed at the word level. In this paper, we propose a sentence-level agreement module that directly minimizes the difference between the representations of the source and target sentences. The proposed agreement module can be integrated into NMT as an additional training objective and can also be used to enhance the representation of the source sentences. Empirical results on the NIST Chinese-to-English and WMT English-to-German tasks show that the proposed agreement module significantly improves NMT performance.
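As a rough illustration of the idea summarized in the abstract, the sketch below implements a sentence-level agreement term as an auxiliary loss: source and target sentence representations are obtained by pooling the token-level states, and a squared-distance penalty pushes the two vectors together alongside the usual word-level cross-entropy. The mean-pooling choice, the PyTorch framing, and the weighting factor are assumptions for illustration only, not the authors' implementation; the paper defines the exact formulation.

```python
import torch
import torch.nn as nn


class SentenceAgreementLoss(nn.Module):
    """Auxiliary loss that pulls pooled source and target sentence vectors together."""

    def forward(self, src_states, src_mask, tgt_states, tgt_mask):
        # src_states: (batch, src_len, d); src_mask: (batch, src_len), 1.0 for real tokens.
        # Mean-pool over non-padding positions to obtain one vector per sentence.
        src_sent = (src_states * src_mask.unsqueeze(-1)).sum(1) / src_mask.sum(1, keepdim=True)
        tgt_sent = (tgt_states * tgt_mask.unsqueeze(-1)).sum(1) / tgt_mask.sum(1, keepdim=True)
        # Squared Euclidean distance between the sentence vectors, averaged over the batch.
        return ((src_sent - tgt_sent) ** 2).sum(-1).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    batch, src_len, tgt_len, d = 2, 5, 7, 16
    src_states = torch.randn(batch, src_len, d)   # e.g. encoder outputs
    tgt_states = torch.randn(batch, tgt_len, d)   # e.g. decoder states or target embeddings
    src_mask = torch.ones(batch, src_len)
    tgt_mask = torch.ones(batch, tgt_len)

    agreement = SentenceAgreementLoss()(src_states, src_mask, tgt_states, tgt_mask)
    # Combined objective: word-level cross-entropy plus the weighted agreement term.
    # total_loss = cross_entropy_loss + 0.1 * agreement   # 0.1 is an illustrative weight
    print(agreement.item())
```

In this sketch the agreement term only adds a scalar to the training objective; the paper additionally uses the sentence-level representation to enhance the source-side encoding, which is not reproduced here.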

Details

Database :
OpenAIRE
Journal :
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Accession number :
edsair.doi...........e6fbbdbbd1dcc968221fd76ce85a3b3d
Full Text :
https://doi.org/10.18653/v1/p19-1296