
Stochastic configuration networks with chaotic maps and hierarchical learning strategy.

Authors :
Qiao, Jinghui
Chen, Yuxi
Source :
Information Sciences. Jun 2023, Vol. 629, p96-108. 13p.
Publication Year :
2023

Abstract

Stochastic configuration networks (SCNs) have universal approximation capability and fast modeling properties, and have been successfully employed in large-scale data analytics. Building on SCNs, stochastic configuration networks with block increments (BSC) use a node-block increment mechanism to improve training speed, but at the cost of greater model complexity. This paper presents a parallel configuration method (PCM), develops an extension of the original BSC based on chaos theory to propose stochastic configuration networks with chaotic maps (SCNCM), and establishes a hierarchical learning strategy (HLS) to enhance the compactness and construction speed of the model. Firstly, PCM randomly assigns the input weights w and biases b of hidden-layer nodes using uniform and normal distributions. Within PCM, an iterative learning algorithm is designed to generate the scope control set and improve configuration efficiency. Secondly, the paper presents two kinds of stochastic configuration networks with chaotic maps, SCNCM-I and SCNCM-II. SCNCM-I adjusts the block size using multiple error values and chaotic maps to improve training speed. Building on SCNCM-I, SCNCM-II employs a node-removal mechanism to enhance compactness. Finally, HLS integrates SCNCM-I, SCNCM-II, and the Harris hawks optimization (HHO) algorithm, aiming to enhance both training speed and model compactness across the three algorithms. Experiments on four benchmark data sets and an industrial application demonstrate the effectiveness of the proposed methods. [ABSTRACT FROM AUTHOR]
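The abstract's two core ingredients can be illustrated in miniature: randomly drawing hidden-node parameters (w, b) from uniform and normal distributions, and letting a chaotic map drive the size of each added node block. The sketch below is a hedged illustration only, not the paper's actual PCM/SCNCM algorithms; the logistic map, the seed value, the `max_block` bound, and the mapping from chaotic state to block size are all assumptions introduced for demonstration.

```python
import numpy as np

def logistic_map(x, r=4.0):
    """One step of the logistic chaotic map x -> r*x*(1-x) on (0, 1).
    A common chaotic map; the paper's specific choice of map is not
    stated in the abstract, so this is an assumption."""
    return r * x * (1.0 - x)

def chaotic_block_size(x, max_block=10):
    """Map a chaotic state in (0, 1) to an integer block size in 1..max_block
    (illustrative mapping, not from the source)."""
    return 1 + int(x * (max_block - 1))

def random_hidden_block(n_inputs, block_size, rng):
    """Draw a block of hidden-node parameters, mixing uniform and normal
    distributions as the abstract describes for PCM."""
    w_uniform = rng.uniform(-1.0, 1.0, size=(block_size, n_inputs))
    w_normal = rng.normal(0.0, 1.0, size=(block_size, n_inputs))
    b = rng.uniform(-1.0, 1.0, size=block_size)
    return w_uniform, w_normal, b

rng = np.random.default_rng(0)
x = 0.37  # arbitrary chaotic seed in (0, 1), avoiding fixed points
sizes = []
for _ in range(5):  # five incremental construction steps
    x = logistic_map(x)
    k = chaotic_block_size(x)
    sizes.append(k)
    w_u, w_n, b = random_hidden_block(n_inputs=3, block_size=k, rng=rng)
```

Each iteration perturbs the block size chaotically rather than keeping it fixed, which is the intuition the abstract gives for SCNCM-I's speedup over a constant-block BSC.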

Details

Language :
English
ISSN :
0020-0255
Volume :
629
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
162396260
Full Text :
https://doi.org/10.1016/j.ins.2023.01.128