
Central Attention Mechanism for Convolutional Neural Networks.

Authors :
Geng, Y. X.
Wang, L.
Wang, Z. Y.
Wang, Y. G.
Source :
IAENG International Journal of Computer Science; Oct 2024, Vol. 51, Issue 10, p1642-1648, 7p
Publication Year :
2024

Abstract

Channel attention has significantly enhanced model performance. In the channel attention approach, average pooling is used to collect feature information and produce a representative value for each channel; however, the average pooling procedure introduces skewness, which lowers the performance of the network architecture. Leveraging the central limit theorem, we hypothesize that a strip-shaped average pooling operation, which takes the spatial position information of the feature map into account, can generate a one-dimensional tensor whose pooled values serve as feature representatives while mitigating this skewness. Building on this idea, this study introduces a novel attention mechanism, the "Central Attention Mechanism (CAM)." Instead of directly using average pooling to generate channel representative values, the central attention approach employs star-stripe average pooling to normalize multiple feature representative values into a single representative value: strip-shaped average pooling collects data and generates a one-dimensional tensor, while star-stripe average pooling provides feature representative values along different spatial directions. The representative value of each channel is then activated to generate channel attention for the corresponding input features. Our attention approach is flexible and can be seamlessly incorporated into various traditional network structures. Rigorous testing demonstrates the effectiveness of our attention strategy, which applies to a wide range of computer vision applications and outperforms previous attention techniques. [ABSTRACT FROM AUTHOR]
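To make the pipeline the abstract describes concrete, the following is a minimal PyTorch sketch, not the authors' implementation (the paper's architecture and code are not reproduced here). It assumes that "strip-shaped average pooling" means 1-D averaging along each spatial axis, that some per-strip processing uses spatial position information (modeled here by hypothetical depthwise 1-D convolutions conv_h and conv_w), that "star-stripe" fusion amounts to normalizing the per-direction representatives into one value per channel (only two of the possible "star" directions are shown; the paper may also use diagonals), and that the final activation is an SE-style bottleneck with sigmoid. All module, parameter, and variable names are invented for illustration.

```python
import torch
import torch.nn as nn


class CentralAttention(nn.Module):
    """Illustrative channel attention built from direction-wise strip pooling.

    A sketch of the abstract's description, under the assumptions stated
    above; not the paper's reference design.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Hypothetical per-strip processing: depthwise 1-D convolutions let
        # each direction's strip use spatial position information before it
        # is collapsed to a single representative value.
        self.conv_h = nn.Conv1d(channels, channels, kernel_size=3,
                                padding=1, groups=channels)
        self.conv_w = nn.Conv1d(channels, channels, kernel_size=3,
                                padding=1, groups=channels)
        # SE-style bottleneck (an assumption borrowed from
        # squeeze-and-excitation) mapping the fused representative value
        # to a per-channel attention logit.
        hidden = max(channels // reduction, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Strip-shaped average pooling: one 1-D tensor per spatial direction.
        strip_h = x.mean(dim=3)                    # (B, C, H): average over width
        strip_w = x.mean(dim=2)                    # (B, C, W): average over height
        # Per-direction representative values for each channel.
        rep_h = self.conv_h(strip_h).mean(dim=2)   # (B, C)
        rep_w = self.conv_w(strip_w).mean(dim=2)   # (B, C)
        # "Star-stripe" step (assumed): normalize the directional
        # representatives into a single representative value per channel.
        rep = 0.5 * (rep_h + rep_w)                # (B, C)
        # Activate the representative value to generate channel attention,
        # then reweight the input features.
        attn = torch.sigmoid(self.fc(rep)).view(b, c, 1, 1)
        return x * attn


# Usage (illustrative): drop the module after any convolutional block.
if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    out = CentralAttention(64)(x)
    assert out.shape == x.shape
```

Because the module returns a tensor of the input's shape, it can be inserted after existing convolutional stages without altering the surrounding architecture, which matches the abstract's claim that the mechanism integrates into various traditional network structures.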

Details

Language :
English
ISSN :
1819-656X
Volume :
51
Issue :
10
Database :
Supplemental Index
Journal :
IAENG International Journal of Computer Science
Publication Type :
Academic Journal
Accession number :
180317796