
Convolutional Self-Attention Network

Authors :
Yang, Baosong
Wang, Longyue
Wong, Derek F.
Chao, Lidia S.
Tu, Zhaopeng
Publication Year :
2018

Abstract

Self-attention network (SAN) has recently attracted increasing interest due to its fully parallelized computation and flexibility in modeling dependencies. It can be further enhanced with a multi-headed attention mechanism, which allows the model to jointly attend to information from different representation subspaces at different positions (Vaswani et al., 2017). In this work, we propose a novel convolutional self-attention network (CSAN), which offers SAN the abilities to 1) capture neighboring dependencies, and 2) model the interaction between multiple attention heads. Experimental results on the WMT14 English-to-German translation task demonstrate that the proposed approach outperforms both the strong Transformer baseline and other existing works on enhancing the locality of SAN. Compared with previous work, our model does not introduce any new parameters.

Comment: The latest version of this paper has been uploaded to another link: arXiv:1904.03107
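For illustration, the following is a minimal sketch of the locality idea the abstract describes: restricting each query's attention to a fixed window of neighboring positions, analogous to a 1D convolution over the sequence. This is not the authors' released implementation; the window size, tensor shapes, and function name are illustrative assumptions.

    # Sketch of local ("convolutional") self-attention: each position attends
    # only to a fixed-size neighborhood, rather than the full sequence.
    # Assumptions: single head, no learned projections, window=5 by default.
    import torch
    import torch.nn.functional as F

    def local_self_attention(x, window=5):
        """x: (batch, seq_len, d_model). Each position attends to a
        `window`-sized neighborhood centered on itself."""
        batch, seq_len, d = x.size()
        # Scaled dot-product scores over all position pairs: (B, L, L).
        scores = torch.matmul(x, x.transpose(1, 2)) / d ** 0.5
        # Mask out positions outside the local window.
        idx = torch.arange(seq_len)
        dist = (idx[None, :] - idx[:, None]).abs()  # pairwise distances (L, L)
        scores = scores.masked_fill(dist > window // 2, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, x)  # (B, L, d_model)

    # Usage: a toy batch of 2 sequences, length 10, model width 16.
    out = local_self_attention(torch.randn(2, 10, 16), window=3)
    print(out.shape)  # torch.Size([2, 10, 16])

The paper's second contribution, modeling interaction across attention heads, would extend this masking idea to a 2D neighborhood spanning both positions and heads; that extension is omitted here for brevity.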

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1106318225
Document Type :
Electronic Resource