Feature Activation through First Power Linear Unit with Sign.

Authors :
Duan, Boxi
Yang, Yufei
Dai, Xianhua
Source :
Electronics (2079-9292); Jul 2022, Vol. 11, Issue 13, 25p
Publication Year :
2022

Abstract

The activation function is a crucial component in the design of a convolutional neural network (CNN): it enables the efficient extraction of multiple features from visual patterns and introduces the non-linearity essential to data processing. This paper proposes a novel activation method termed FPLUS, which takes the form of a power function applied symmetrically with the sign of its input. It is inspired by common inverse operations and carries an intuitive bionic interpretation. The formulation is derived theoretically from prior knowledge and desired properties, and its feasibility is then verified through a series of experiments on typical benchmark datasets. The results indicate that our approach is highly competitive among numerous activation functions and exhibits stable compatibility across many CNN architectures. Furthermore, we extend the presented function to a more generalized form, called PFPLUS, with two parameters that can be fixed or learnable, thereby augmenting its expressive capacity; the outcomes of identical tests validate this improvement. We therefore believe this work holds value in enriching the family of activation units.
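For readers skimming this record, a minimal sketch may help clarify what a sign-symmetric, power-based activation with fixed-or-learnable parameters looks like in practice. The exact FPLUS/PFPLUS formulas are defined in the full paper and are not reproduced here; the form k * sign(x) * ((|x| + 1)^p - 1) and the names PFPlusSketch, p, and k below are illustrative assumptions only, chosen to match the abstract's description (odd symmetry via the input's sign, a power transform, two parameters that can be fixed or learnable).

    import torch
    import torch.nn as nn

    class PFPlusSketch(nn.Module):
        """Hypothetical sketch of a sign-symmetric power activation.

        NOTE: this is NOT the authors' exact PFPLUS definition; the form
        k * sign(x) * ((|x| + 1)**p - 1) is an assumed stand-in with two
        parameters (p, k) that can be fixed or learnable, as the
        abstract describes.
        """

        def __init__(self, p: float = 0.5, k: float = 1.0, learnable: bool = True):
            super().__init__()
            if learnable:
                # Registered as parameters so the optimizer updates them.
                self.p = nn.Parameter(torch.tensor(p))
                self.k = nn.Parameter(torch.tensor(k))
            else:
                # Registered as buffers: moved with the module, never trained.
                self.register_buffer("p", torch.tensor(p))
                self.register_buffer("k", torch.tensor(k))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Odd-symmetric: the power transform acts on |x| and the input's
            # sign is restored afterwards, so f(-x) = -f(x) and f(0) = 0.
            return self.k * torch.sign(x) * ((x.abs() + 1.0) ** self.p - 1.0)

    # Usage: drop in wherever a ReLU-style activation would go.
    act = PFPlusSketch(learnable=True)
    y = act(torch.randn(8, 16))

The learnable/fixed switch mirrors the abstract's statement that the two PFPLUS parameters "can be fixed or learnable"; making them nn.Parameter objects lets backpropagation tune the activation's shape per network.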

Details

Language :
English
ISSN :
2079-9292
Volume :
11
Issue :
13
Database :
Complementary Index
Journal :
Electronics (2079-9292)
Publication Type :
Academic Journal
Accession number :
157915493
Full Text :
https://doi.org/10.3390/electronics11131980