Concurrent subspace width optimization method for RBF neural network modeling.
- Source :
- IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2012 Feb; Vol. 23 (2), pp. 247-59.
- Publication Year :
- 2012
Abstract
- Radial basis function neural networks (RBFNNs) are widely used in nonlinear function approximation. One of the challenges in RBFNN modeling is how to effectively optimize the width parameters to improve approximation accuracy. To solve this problem, a width optimization method, concurrent subspace width optimization (CSWO), is proposed based on a decomposition and coordination strategy. This method decomposes the large-scale width optimization problem into several subspace optimization (SSO) problems, each of which has a single optimization variable and smaller training and validation data sets, which greatly reduces optimization complexity. These SSOs can be solved concurrently, so computational time can be effectively reduced. With top-level system coordination, the optimization of the SSOs converges to a consistent optimum, which is equivalent to the optimum of the original width optimization problem. The proposed method is tested on four mathematical examples and one practical engineering approximation problem. The results demonstrate the efficiency and robustness of CSWO in optimizing width parameters compared with traditional width optimization methods.
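- The abstract's decomposition idea can be illustrated with a minimal sketch: each "subspace" problem tunes a single Gaussian width against a validation set while the output-layer weights are refit by linear least squares. This is an assumption-laden simplification, not the paper's algorithm: the SSOs are solved serially here by grid search (the paper solves them concurrently with a top-level coordination step), and all function names, the test function, and the width grid are illustrative.

```python
import numpy as np

def rbf_design(x, centers, widths):
    # Gaussian RBF design matrix: phi_ij = exp(-(x_i - c_j)^2 / (2 w_j^2))
    return np.exp(-(x[:, None] - centers[None, :]) ** 2
                  / (2 * widths[None, :] ** 2))

def fit_weights(xt, yt, centers, widths):
    # Output-layer weights via linear least squares on the training set
    Phi = rbf_design(xt, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, yt, rcond=None)
    return w

def val_error(widths, xt, yt, xv, yv, centers):
    # Mean-squared error on the validation set for a given width vector
    w = fit_weights(xt, yt, centers, widths)
    pred = rbf_design(xv, centers, widths) @ w
    return np.mean((pred - yv) ** 2)

def cswo_sketch(xt, yt, xv, yv, centers, widths, grid, passes=5):
    # Serial analogue of the decomposition: each single-variable
    # subspace problem tunes one width by grid search, keeping the
    # others fixed; repeated passes play the role of coordination.
    widths = widths.copy()
    for _ in range(passes):
        for j in range(len(widths)):
            best, best_err = widths[j], val_error(widths, xt, yt, xv, yv, centers)
            for s in grid:
                trial = widths.copy()
                trial[j] = s
                err = val_error(trial, xt, yt, xv, yv, centers)
                if err < best_err:
                    best, best_err = s, err
            widths[j] = best
    return widths

# Toy problem: approximate sin(2*pi*x) on [0, 1] with 5 Gaussian units
rng = np.random.default_rng(0)
xt = np.linspace(0.0, 1.0, 40); yt = np.sin(2 * np.pi * xt)
xv = rng.uniform(0.0, 1.0, 20); yv = np.sin(2 * np.pi * xv)
centers = np.linspace(0.0, 1.0, 5)
widths0 = np.full(5, 1.0)                 # deliberately too wide
grid = np.linspace(0.05, 1.0, 20)

widths = cswo_sketch(xt, yt, xv, yv, centers, widths0, grid)
err0 = val_error(widths0, xt, yt, xv, yv, centers)
err1 = val_error(widths, xt, yt, xv, yv, centers)
print(err1 <= err0)  # search only accepts improvements, so this holds
```

- Because each subspace step accepts a new width only if the validation error decreases, the final error can never exceed the initial one; the paper's contribution is making these single-variable searches run concurrently while a coordination layer keeps them consistent.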
Details
- Language :
- English
- ISSN :
- 2162-2388
- Volume :
- 23
- Issue :
- 2
- Database :
- MEDLINE
- Journal :
- IEEE transactions on neural networks and learning systems
- Publication Type :
- Academic Journal
- Accession number :
- 24808504
- Full Text :
- https://doi.org/10.1109/TNNLS.2011.2178560