
On the Sparsity of Neural Machine Translation Models

Authors:
Wang, Yong
Wang, Longyue
Li, Victor O. K.
Tu, Zhaopeng
Publication Year:
2020

Abstract

Modern neural machine translation (NMT) models employ a large number of parameters, which leads to serious over-parameterization and typically causes the underutilization of computational resources. In response to this problem, we empirically investigate whether the redundant parameters can be reused to achieve better performance. Experiments and analyses are systematically conducted on different datasets and NMT architectures. We show that: 1) the pruned parameters can be rejuvenated to improve the baseline model by up to +0.8 BLEU points; 2) the rejuvenated parameters are reallocated to enhance the ability of modeling low-level lexical information.

Comment: EMNLP 2020
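The core idea described in the abstract is to prune redundant parameters and then let them re-enter training ("rejuvenation") so they can be reallocated to new functions. The sketch below is a minimal PyTorch illustration of such a prune-then-rejuvenate cycle under assumed details (magnitude-based pruning, a toy linear layer, placeholder data and loss); it is not the authors' exact procedure.

```python
# Hypothetical sketch of prune-then-rejuvenate training; details are assumptions,
# not the paper's actual implementation.
import torch
import torch.nn as nn

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Return a 0/1 mask keeping the (1 - sparsity) largest-magnitude weights."""
    threshold = torch.quantile(weight.abs().flatten(), sparsity)
    return (weight.abs() > threshold).float()

# Toy stand-in for an NMT model; in practice this would be a Transformer.
model = nn.Linear(512, 512)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# --- Phase 1: prune, then train with the pruned weights held at zero ---
mask = magnitude_mask(model.weight.data, sparsity=0.3)
for step in range(100):                      # placeholder training loop
    x = torch.randn(32, 512)                 # placeholder batch
    loss = model(x).pow(2).mean()            # placeholder loss
    optimizer.zero_grad()
    loss.backward()
    model.weight.grad.mul_(mask)             # block updates to pruned weights
    optimizer.step()
    model.weight.data.mul_(mask)             # enforce exact zeros

# --- Phase 2: rejuvenation - drop the mask and keep training -----------
# The previously pruned parameters are free to move again and can be
# reallocated to new roles (e.g. low-level lexical modeling).
for step in range(100):
    x = torch.randn(32, 512)
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```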

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2010.02646
Document Type:
Working Paper