
Pre-training neural machine translation with alignment information via optimal transport.

Authors :
Su, Xueping
Zhao, Xingkai
Ren, Jie
Li, Yunhong
Rätsch, Matthias
Source :
Multimedia Tools & Applications; May 2024, Vol. 83 Issue 16, p48377-48397, 21p
Publication Year :
2024

Abstract

With the rapid development of globalization, the demand for translation between different languages is also increasing. Although pre-training has achieved excellent results in neural machine translation, existing neural machine translation has almost no high-quality alignment information suitable for specific domains, so this paper proposes pre-training neural machine translation with alignment information via optimal transport. First, this paper narrows the representation gap between different languages by using OTAP to generate domain-specific data for information alignment, learning richer semantic information. Second, this paper proposes a lightweight model, DR-Reformer, which uses Reformer as the backbone network and adds Dropout layers and Reduction layers, reducing model parameters without losing accuracy and improving computational efficiency. Experiments on the Chinese-English datasets of AI Challenger 2018 and WMT-17 show that the proposed algorithm outperforms existing algorithms. [ABSTRACT FROM AUTHOR]
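The abstract does not give the exact formulation of the OTAP alignment step, but the general idea of aligning source and target token representations with optimal transport can be sketched as follows. This is a minimal illustration assuming entropy-regularized OT (Sinkhorn iterations) over cosine-distance costs between encoder embeddings; the function name `sinkhorn_alignment` and the parameters `reg` and `n_iters` are illustrative and not taken from the paper.

```python
# A minimal, self-contained sketch of optimal-transport-based soft alignment
# between source and target token embeddings. Assumes entropic OT (Sinkhorn),
# a common choice; the paper's actual OTAP formulation may differ.

import numpy as np

def sinkhorn_alignment(src_emb, tgt_emb, reg=0.1, n_iters=100):
    """Soft-align source and target token embeddings via entropic OT.

    src_emb: (m, d) array of source token embeddings
    tgt_emb: (n, d) array of target token embeddings
    Returns an (m, n) transport plan whose entries act as alignment weights.
    """
    # Cosine distance as the transport cost between token pairs.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    cost = 1.0 - src @ tgt.T                     # (m, n)

    # Uniform marginals: every token carries equal mass.
    m, n = cost.shape
    a, b = np.full(m, 1.0 / m), np.full(n, 1.0 / n)

    # Sinkhorn iterations on the Gibbs kernel.
    K = np.exp(-cost / reg)
    u = np.ones(m)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return np.diag(u) @ K @ np.diag(v)           # transport plan = soft alignment

# Toy usage: random embeddings standing in for encoder outputs.
rng = np.random.default_rng(0)
plan = sinkhorn_alignment(rng.normal(size=(5, 16)), rng.normal(size=(7, 16)))
print(plan.shape, plan.sum())                    # (5, 7); total mass ~1.0
```

The resulting transport plan can be read row-wise as a soft word-alignment matrix between the two sentences, which is the kind of alignment signal the abstract describes injecting into pre-training.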

Details

Language :
English
ISSN :
1380-7501
Volume :
83
Issue :
16
Database :
Complementary Index
Journal :
Multimedia Tools & Applications
Publication Type :
Academic Journal
Accession number :
177079370
Full Text :
https://doi.org/10.1007/s11042-023-17479-z