
Attention Mechanism with Energy-Friendly Operations

Authors :
Wan, Yu
Yang, Baosong
Liu, Dayiheng
Xiao, Rong
Wong, Derek F.
Zhang, Haibo
Chen, Boxing
Chao, Lidia S.
Publication Year :
2022

Abstract

The attention mechanism has become the dominant module in natural language processing models. It is computationally intensive and depends on massive power-hungry multiplications. In this paper, we rethink variants of the attention mechanism from the perspective of energy consumption. After concluding that the energy costs of several energy-friendly operations are far lower than those of their multiplication counterparts, we build a novel attention model by replacing multiplications with either selective operations or additions. Empirical results on three machine translation tasks demonstrate that the proposed model, compared against the vanilla one, achieves comparable accuracy while saving 99% and 66% of the energy during alignment calculation and the whole attention procedure, respectively. Code is available at: https://github.com/NLP2CT/E-Att.

Comment: Findings@ACL2022
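To make the abstract's claim concrete, below is a minimal illustrative sketch of multiplication-free alignment, not the authors' E-Att implementation: the dot-product similarity is swapped for a negative L1 distance, which needs only subtractions, absolute values, and additions. The softmax and value aggregation still use multiplications here, consistent with the paper reporting larger savings for alignment (99%) than for the whole attention procedure (66%). The function names and the L1 scoring choice are assumptions made for illustration.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(Q, K, V):
    # Vanilla attention: alignment scores rely on multiplications (Q @ K^T).
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return softmax(scores) @ V

def additive_l1_attention(Q, K, V):
    # Multiplication-free alignment (illustrative): score each query-key
    # pair by the negative L1 distance, which uses only subtractions,
    # absolute values, and additions. Softmax and the weighted sum over V
    # still involve multiplications.
    d = Q.shape[-1]
    scores = -np.abs(Q[:, None, :] - K[None, :, :]).sum(-1) / np.sqrt(d)
    return softmax(scores) @ V

# Toy usage: 4 queries/keys/values of dimension 8.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
print(additive_l1_attention(Q, K, V).shape)  # (4, 8)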

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333766782
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.18653/v1/2022.findings-acl.313