
HW-TSC's Submission to the CCMT 2024 Machine Translation Tasks

Authors:
Wu, Zhanglin
Luo, Yuanchang
Wei, Daimeng
Zheng, Jiawei
Wei, Bin
Li, Zongyao
Shang, Hengchao
Guo, Jiaxin
Li, Shaojun
Zhang, Weidong
Xie, Ning
Yang, Hao
Publication Year:
2024

Abstract

This paper presents the submission of Huawei Translation Services Center (HW-TSC) to the machine translation tasks of the 20th China Conference on Machine Translation (CCMT 2024). We participate in the bilingual machine translation task and the multi-domain machine translation task. For these two translation tasks, we use training strategies such as regularized dropout, bidirectional training, data diversification, forward translation, back translation, alternated training, curriculum learning, and transductive ensemble learning to train neural machine translation (NMT) models based on the deep Transformer-big architecture. Furthermore, to explore whether a large language model (LLM) can help improve the translation quality of NMT systems, we use supervised fine-tuning to train llama2-13b as an automatic post-editing (APE) model that refines the translation results of the NMT model on the multi-domain machine translation task. By using these strategies, our submission achieves competitive results in the final evaluation.

Comment: 14 pages, 2 figures, 6 tables, CCMT 2024. arXiv admin note: substantial text overlap with arXiv:2409.14800
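Of the training strategies the abstract lists, regularized dropout (R-Drop) is the most self-contained to illustrate: the same batch is passed through the model twice with dropout active, and a symmetric KL term pulls the two predictive distributions together. The following is a minimal PyTorch sketch under that reading, not the authors' implementation; the model signature, the weight `alpha`, and all names are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def r_drop_loss(model, src, tgt, labels, pad_id, alpha=5.0):
    # Hypothetical seq2seq model returning token logits of shape
    # (batch, seq_len, vocab); two forward passes differ because of dropout.
    logits1 = model(src, tgt)
    logits2 = model(src, tgt)

    # Standard cross-entropy, averaged over both passes.
    # cross_entropy expects (batch, vocab, seq_len), hence the transpose.
    ce = 0.5 * (
        F.cross_entropy(logits1.transpose(1, 2), labels, ignore_index=pad_id)
        + F.cross_entropy(logits2.transpose(1, 2), labels, ignore_index=pad_id)
    )

    # Symmetric KL divergence between the two predictive distributions.
    p = F.log_softmax(logits1, dim=-1)
    q = F.log_softmax(logits2, dim=-1)
    kl = 0.5 * (
        F.kl_div(p, q, log_target=True, reduction="batchmean")
        + F.kl_div(q, p, log_target=True, reduction="batchmean")
    )
    return ce + alpha * kl
```

A production version would also mask padding positions out of the KL term; the sketch omits this for brevity.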
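The LLM-based APE step can be sketched in the same hedged spirit: APE is framed as supervised fine-tuning of a causal LM (here llama2-13b) on triples of source, MT hypothesis, and corrected translation. The prompt template and field names below are assumptions for illustration, not the authors' actual format.

```python
def build_ape_example(src: str, mt: str, post_edit: str) -> dict:
    # Hypothetical prompt layout: show the source and the NMT output,
    # then ask the LLM for an improved translation.
    prompt = (
        "Improve the following machine translation.\n"
        f"Source: {src}\n"
        f"Translation: {mt}\n"
        "Improved translation:"
    )
    # For causal-LM fine-tuning, the target completion is the human
    # post-edit (or reference translation) continuing the prompt.
    return {"prompt": prompt, "completion": " " + post_edit}
```

Records built this way feed standard supervised fine-tuning (e.g., next-token cross-entropy on the completion), after which the tuned model rewrites the NMT system's outputs at inference time.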

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2409.14842
Document Type:
Working Paper