
An Integer-Fractional Gradient Algorithm for Back Propagation Neural Networks.

Authors :
Zhang, Yiqun
Xu, Honglei
Li, Yang
Lin, Gang
Zhang, Liyuan
Tao, Chaoyang
Wu, Yonghong
Source :
Algorithms. May 2024, Vol. 17, Issue 5, p220. 16 p.
Publication Year :
2024

Abstract

This paper proposes a new optimization algorithm for backpropagation (BP) neural networks that fuses integer-order differentiation with fractional-order differentiation. While fractional-order differentiation has significant advantages in describing complex phenomena with long-term memory effects and nonlocality, its application in neural networks is often limited by a lack of physical interpretability and by inconsistencies with traditional models. To address these challenges, we propose a mixed integer-fractional (MIF) gradient descent algorithm for training neural networks. Furthermore, a detailed convergence analysis of the proposed algorithm is provided. Finally, numerical experiments illustrate that the new gradient descent algorithm not only speeds up the convergence of BP neural networks but also increases their classification accuracy. [ABSTRACT FROM AUTHOR]
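The abstract does not give the paper's exact update rule, but the general idea of mixing an integer-order gradient with a fractional-order one can be sketched as follows. This is a minimal illustration, not the authors' MIF algorithm: the Caputo-style one-term fractional-gradient approximation, the mixing weight `lam`, and the toy quadratic loss are all assumptions made for the example.

```python
import math
import numpy as np

def mif_step(w, grad, w_prev, alpha=0.9, lam=0.5, lr=0.1, eps=1e-8):
    """One hypothetical mixed integer-fractional gradient step.

    w      : current weights
    grad   : ordinary (integer-order) gradient at w
    w_prev : weights from the previous step (lower terminal of the
             fractional derivative; this choice is an assumption)
    alpha  : fractional order in (0, 1)
    lam    : mixing weight between integer- and fractional-order terms
    """
    # One-term Caputo-style approximation of the order-alpha gradient:
    # D^alpha f ~ f'(w) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha)
    frac_grad = grad * np.abs(w - w_prev + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    # Convex combination of the two gradient terms drives the update
    return w - lr * (lam * grad + (1.0 - lam) * frac_grad)

# Toy example: minimize f(w) = ||w||^2, whose integer gradient is 2w
w_prev = np.array([2.0, -2.0])
w = np.array([1.5, -1.5])
for _ in range(200):
    w, w_prev = mif_step(w, 2.0 * w, w_prev), w
```

In this sketch the fractional term shrinks as the iterates settle (the step-size factor `|w - w_prev|^(1-alpha)` goes to zero), so the mixed update behaves like plain gradient descent near a minimum while weighting recent trajectory history early on.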

Details

Language :
English
ISSN :
1999-4893
Volume :
17
Issue :
5
Database :
Academic Search Index
Journal :
Algorithms
Publication Type :
Academic Journal
Accession number :
177458643
Full Text :
https://doi.org/10.3390/a17050220