
Exploring the Impact of Layer Normalization for Zero-shot Neural Machine Translation

Authors :
Mao, Zhuoyuan
Dabre, Raj
Liu, Qianying
Song, Haiyue
Chu, Chenhui
Kurohashi, Sadao
Publication Year :
2023

Abstract

This paper studies the impact of layer normalization (LayerNorm) on zero-shot translation (ZST). Recent efforts for ZST often utilize the Transformer architecture as the backbone, with LayerNorm at the input of layers (PreNorm) set as the default. However, Xu et al. (2019) revealed that PreNorm carries the risk of overfitting the training data. Based on this, we hypothesize that PreNorm may overfit supervised directions and thus generalize poorly for ZST. Through experiments on the OPUS, IWSLT, and Europarl datasets covering 54 ZST directions, we demonstrate that the original Transformer setting of LayerNorm after residual connections (PostNorm) consistently outperforms PreNorm by up to 12.3 BLEU points. We then study the performance disparities by analyzing the differences in off-target rates and structural variations between PreNorm and PostNorm. This study highlights the need for careful consideration of the LayerNorm setting for ZST.

Comment: Accepted to ACL 2023 main conference
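For concreteness, the abstract's distinction between the two LayerNorm placements can be illustrated with a minimal PyTorch sketch. This is not the paper's code; the `SublayerConnection` class and its interface are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SublayerConnection(nn.Module):
    """Wraps a Transformer sublayer (e.g., self-attention or feed-forward)
    with a residual connection and LayerNorm, in either the PreNorm or
    PostNorm order. Illustrative sketch only, not the paper's implementation."""

    def __init__(self, d_model: int, pre_norm: bool):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.pre_norm = pre_norm

    def forward(self, x: torch.Tensor, sublayer: nn.Module) -> torch.Tensor:
        if self.pre_norm:
            # PreNorm: normalize the sublayer input; the residual path
            # bypasses LayerNorm entirely: x + Sublayer(LayerNorm(x))
            return x + sublayer(self.norm(x))
        # PostNorm (original Transformer): normalize after the residual
        # addition: LayerNorm(x + Sublayer(x))
        return self.norm(x + sublayer(x))

# Example usage with a feed-forward sublayer (dimensions are arbitrary):
d_model = 512
block = SublayerConnection(d_model, pre_norm=False)  # PostNorm variant
ffn = nn.Sequential(nn.Linear(d_model, 2048), nn.ReLU(), nn.Linear(2048, d_model))
x = torch.randn(8, 16, d_model)  # (batch, length, d_model)
y = block(x, ffn)
```

The only difference between the two settings is where LayerNorm sits relative to the residual addition, which is what the paper's PreNorm/PostNorm comparison hinges on.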

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2305.09312
Document Type :
Working Paper