
Adaptive Morphing Activation Function for Neural Networks.

Authors:
Herrera-Alcántara, Oscar
Arellano-Balderas, Salvador
Source:
Fractal & Fractional. Aug 2024, Vol. 8, Issue 8, p444. 32p.
Publication Year:
2024

Abstract

A novel morphing activation function is proposed, motivated by wavelet theory and the use of wavelets as activation functions. Morphing refers to the gradual change of shape to mimic several apparently unrelated activation functions. The shape is controlled by the fractional-order derivative, which is a trainable parameter to be optimized in the neural network learning process. Given the morphing activation function, and taking only integer-order derivatives, efficient piecewise polynomial versions of several existing activation functions are obtained. Experiments show that the performance of the polynomial versions PolySigmoid, PolySoftplus, PolyGeLU, PolySwish, and PolyMish is similar to or better than that of their counterparts Sigmoid, Softplus, GeLU, Swish, and Mish. Furthermore, it is possible to learn the best shape from the data by optimizing the fractional-order derivative with gradient descent algorithms, leading to the study of a more general formula based on fractional calculus to build and adapt activation functions with properties useful in machine learning. [ABSTRACT FROM AUTHOR]
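The abstract does not give the authors' fractional-calculus formula, but the core idea — a single trainable shape parameter that morphs one activation into another and is fit by gradient descent — can be sketched with a simpler, well-known analogue. The snippet below uses a Swish-style function x · sigmoid(βx), whose shape interpolates between a scaled identity (β → 0) and ReLU (β → ∞). The names `morphing_swish` and `fit_beta`, and the toy numerical-gradient loop, are illustrative assumptions, not the paper's implementation.

```python
import math

def morphing_swish(x: float, beta: float) -> float:
    """Swish-style activation x * sigmoid(beta * x).

    beta is a trainable shape parameter: beta -> 0 gives 0.5 * x
    (a scaled identity), while large beta approaches ReLU. This is
    an illustrative analogue of a shape-controlled activation, not
    the fractional-derivative formula from the paper.
    """
    return x / (1.0 + math.exp(-beta * x))

def fit_beta(xs, targets, beta=0.5, lr=0.1, steps=200, eps=1e-5):
    """Toy gradient descent on beta, minimizing squared error
    against target outputs via a central-difference gradient."""
    def loss(b):
        return sum((morphing_swish(x, b) - t) ** 2 for x, t in zip(xs, targets))
    for _ in range(steps):
        grad = (loss(beta + eps) - loss(beta - eps)) / (2 * eps)
        beta -= lr * grad
    return beta

# Targets generated by a ReLU-like shape; gradient descent
# pushes beta upward, morphing the activation toward ReLU.
xs = [-2.0, -1.0, 0.5, 1.0, 2.0]
targets = [max(0.0, x) for x in xs]
learned = fit_beta(xs, targets)
```

In the paper's setting the shape parameter is the fractional derivative order rather than a Swish slope, but the training mechanics (treat the shape parameter as a weight, differentiate the loss with respect to it, update it alongside the network weights) follow the same pattern.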

Details

Language:
English
ISSN:
2504-3110
Volume:
8
Issue:
8
Database:
Academic Search Index
Journal:
Fractal & Fractional
Publication Type:
Academic Journal
Accession number:
179380505
Full Text:
https://doi.org/10.3390/fractalfract8080444