
Self-organizing radial basis function neural network using accelerated second-order learning algorithm.

Authors:
Han, Hong-Gui
Ma, Miao-Li
Yang, Hong-Yan
Qiao, Jun-Fei
Source:
Neurocomputing, Jan 2022, Vol. 469, p. 1-12. 12 p.
Publication Year:
2022

Abstract

Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, it remains difficult to avoid the vanishing-gradient problem and thereby improve learning performance during training. For this reason, this paper develops an accelerated second-order learning (ASOL) algorithm to train RBFNNs. First, an adaptive expansion and pruning mechanism (AEPM) for the gradient space, based on the integrity and orthogonality of hidden neurons, is designed. Effective gradient information is continually added to the gradient space, while redundant gradient information is eliminated from it. Second, guided by AEPM, hidden neurons are generated or pruned accordingly. In this way, a self-organizing RBFNN (SORBFNN) is obtained that reduces structural complexity and improves generalization ability, and the proposed ASOL-based SORBFNN (ASOL-SORBFNN) optimizes both the structure and the parameters during learning. Third, theoretical analyses are given, covering the effectiveness of the proposed AEPM in avoiding the vanishing gradient and the stability of SORBFNN during structural adjustment, which together guarantee the successful application of the proposed ASOL-SORBFNN. Finally, to illustrate the advantages of the proposed ASOL-SORBFNN, several experimental studies are conducted. Compared with other existing approaches, ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy. [ABSTRACT FROM AUTHOR]
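The abstract describes the method only at a high level; the paper's actual ASOL update and AEPM criteria are not reproduced here. As a rough illustration of the two ingredients it mentions, a second-order (damped Gauss-Newton / Levenberg-Marquardt-style) weight update and redundancy-based pruning of hidden units, the following sketch trains a small Gaussian RBF network on synthetic data. All function names, the damping constant, and the correlation threshold are illustrative assumptions, not the authors' algorithm.

```python
# Minimal illustrative sketch: Gaussian RBF network with a damped
# second-order (Levenberg-Marquardt-style) update of the output weights
# and a crude correlation-based pruning of redundant hidden units.
# NOT the paper's ASOL-SORBFNN; thresholds and damping are assumptions.
import numpy as np

def rbf_activations(X, centers, widths):
    # Gaussian basis functions: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def lm_step(X, y, centers, widths, w, damping=1e-2):
    # One damped Gauss-Newton update of the output weights w.
    # For an output layer linear in w, the Jacobian is the activation matrix.
    Phi = rbf_activations(X, centers, widths)           # (N, H)
    residual = y - Phi @ w                               # (N,)
    H = Phi.T @ Phi + damping * np.eye(Phi.shape[1])     # damped approximate Hessian
    return w + np.linalg.solve(H, Phi.T @ residual)

def prune_redundant(Phi, corr_threshold=0.98):
    # Toy pruning: drop a hidden unit whose activation vector is nearly
    # collinear with an earlier unit (a crude stand-in for the paper's
    # orthogonality-based mechanism).
    keep = list(range(Phi.shape[1]))
    norms = Phi / (np.linalg.norm(Phi, axis=0, keepdims=True) + 1e-12)
    for j in range(Phi.shape[1]):
        for i in range(j):
            if i in keep and j in keep and abs(norms[:, i] @ norms[:, j]) > corr_threshold:
                keep.remove(j)
                break
    return keep

# Tiny usage example on synthetic 1-D regression data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

centers = rng.uniform(-3, 3, size=(12, 1))
widths = np.full(12, 0.8)
w = np.zeros(12)

for _ in range(20):
    w = lm_step(X, y, centers, widths, w)

Phi = rbf_activations(X, centers, widths)
kept = prune_redundant(Phi)
print("kept hidden units:", len(kept), "mse:", float(np.mean((y - Phi @ w) ** 2)))
```

On this toy problem the damped second-order step converges in a handful of iterations, which is the practical appeal of second-order training that the abstract points to; the pruning step here is only a correlation filter, whereas the paper bases growth and pruning on the integrity and orthogonality of the hidden-neuron gradient space.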

Details

Language:
English
ISSN:
0925-2312
Volume:
469
Database:
Academic Search Index
Journal:
Neurocomputing
Publication Type:
Academic Journal
Accession Number:
153825648
Full Text:
https://doi.org/10.1016/j.neucom.2021.10.065