Decomposition Techniques for Multilayer Perceptron Training.
- Source: IEEE Transactions on Neural Networks and Learning Systems [IEEE Trans Neural Netw Learn Syst] 2016 Nov; Vol. 27 (11), pp. 2146-2159. Date of Electronic Publication: 2015 Sep 22.
- Publication Year: 2016
Abstract
- In this paper, we consider the learning problem of multilayer perceptrons (MLPs), formulated as the minimization of a smooth error function. As is well known, MLP training can be a difficult nonlinear, nonconvex optimization problem. Typical difficulties include extensive flat regions and steep-sided valleys in the error surface, as well as the possibly large number of training data and free network parameters. We define a wide class of batch learning algorithms for MLPs based on block decomposition techniques for minimizing the error function. The learning problem is decomposed into a sequence of smaller, structured minimization problems in order to exploit the structure of the objective function advantageously. Theoretical convergence results are established, and a specific algorithm is constructed and evaluated through extensive numerical experimentation. Comparisons with state-of-the-art learning algorithms show the effectiveness of the proposed techniques.
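- To make the block-decomposition idea concrete, below is a minimal sketch, not the authors' algorithm, of a two-block scheme for a one-hidden-layer MLP with squared error: with the hidden weights fixed, the output-weight subproblem is linear least squares and is solved exactly (exploiting the structure of the objective), while the hidden-weight subproblem remains nonconvex and is handled with a few gradient steps. All names and parameters here (`fit_mlp_two_block`, `n_hidden`, `inner_grad_steps`, the learning rate) are illustrative assumptions, not from the paper.

```python
# Sketch of two-block decomposition for training a one-hidden-layer MLP.
# Block 1: output weights V -> exact solve of a linear least-squares subproblem.
# Block 2: hidden weights W -> a few gradient steps on a nonconvex subproblem.
# All names below are illustrative assumptions, not the paper's notation.
import numpy as np

def fit_mlp_two_block(X, y, n_hidden=20, outer_iters=50,
                      inner_grad_steps=10, lr=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.5, size=(d, n_hidden))  # hidden-layer weights
    V = rng.normal(scale=0.5, size=n_hidden)       # output-layer weights

    for _ in range(outer_iters):
        H = np.tanh(X @ W)                         # hidden activations, (n, m)

        # Block 1: with W fixed, min_V ||H V - y||^2 is linear least squares,
        # so this subproblem is solved exactly.
        V, *_ = np.linalg.lstsq(H, y, rcond=None)

        # Block 2: with V fixed, take a few gradient steps on the
        # nonconvex subproblem in W (mean squared error over n samples).
        for _ in range(inner_grad_steps):
            H = np.tanh(X @ W)
            r = H @ V - y                          # residuals, shape (n,)
            # dE/dW = X^T [ (r V^T) * (1 - H^2) ] / n  for E = ||r||^2 / (2n)
            G = X.T @ (np.outer(r, V) * (1.0 - H ** 2)) / n
            W -= lr * G

    return W, V

# Usage on a toy regression problem:
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    W, V = fit_mlp_two_block(X, y)
    mse = np.mean((np.tanh(X @ W) @ V - y) ** 2)
    print(f"training MSE: {mse:.4f}")
```

- The design point this sketch tries to capture is the one stated in the abstract: by partitioning the variables into blocks, part of the problem (here, the output weights) becomes easy and is solved in closed form, leaving only a smaller nonconvex subproblem to be attacked iteratively.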
Details
- Language: English
- ISSN: 2162-2388
- Volume: 27
- Issue: 11
- Database: MEDLINE
- Journal: IEEE Transactions on Neural Networks and Learning Systems
- Publication Type: Academic Journal
- Accession Number: 26415186
- Full Text: https://doi.org/10.1109/TNNLS.2015.2475621