
Decomposition Techniques for Multilayer Perceptron Training.

Authors :
Grippo, Luigi
Manno, Andrea
Sciandrone, Marco
Source :
IEEE Transactions on Neural Networks & Learning Systems; Nov 2016, Vol. 27 Issue 11, p2146-2159, 14p
Publication Year :
2016

Abstract

In this paper, we consider the learning problem of multilayer perceptrons (MLPs) formulated as the problem of minimizing a smooth error function. As is well known, the learning problem of MLPs can be a difficult nonlinear, nonconvex optimization problem. Typical difficulties include the presence of extensive flat regions and steep-sided valleys in the error surface, and the possibly large number of training data and of free network parameters. We define a wide class of batch learning algorithms for MLPs, based on the use of block decomposition techniques in the minimization of the error function. The learning problem is decomposed into a sequence of smaller, structured minimization problems in order to advantageously exploit the structure of the objective function. Theoretical convergence results are established, and a specific algorithm is constructed and evaluated through extensive numerical experiments. Comparisons with state-of-the-art learning algorithms show the effectiveness of the proposed techniques. [ABSTRACT FROM AUTHOR]
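To make the idea of block decomposition concrete, the following is a minimal sketch (not the authors' actual algorithm) of a classic two-block scheme for a one-hidden-layer MLP with a linear output layer: with the hidden-layer weights fixed, the error is quadratic in the output-layer weights and can be minimized exactly by linear least squares; with the output-layer weights fixed, a few gradient steps are taken on the nonconvex hidden-layer sub-problem. All names, sizes, and step counts below are illustrative assumptions.

```python
import numpy as np

def train_mlp_block_decomposition(X, y, n_hidden=10, n_outer=50,
                                  inner_steps=20, lr=0.05, seed=0):
    """Two-block decomposition sketch for a one-hidden-layer MLP.

    Block 1 (output weights v): exact minimization via linear least squares.
    Block 2 (hidden weights W): a few gradient steps with v held fixed.
    (Illustrative only; not the specific algorithm of the paper.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden)) * 0.5   # hidden-layer weights
    v = rng.standard_normal(n_hidden) * 0.5        # output-layer weights

    for _ in range(n_outer):
        H = np.tanh(X @ W)                          # hidden activations
        # Block 1: with W fixed, the error is quadratic in v -> solve exactly.
        v, *_ = np.linalg.lstsq(H, y, rcond=None)
        # Block 2: with v fixed, take a few gradient steps on W.
        for _ in range(inner_steps):
            H = np.tanh(X @ W)
            r = H @ v - y                           # residuals
            # dE/dW = X^T [ (r v^T) * (1 - H^2) ] / n  for E = ||Hv - y||^2 / 2n
            G = X.T @ (np.outer(r, v) * (1.0 - H**2)) / n
            W -= lr * G
    return W, v

# Usage: fit a simple 1-D regression target.
X = np.linspace(-2, 2, 200).reshape(-1, 1)
y = np.sin(2 * X[:, 0])
W, v = train_mlp_block_decomposition(X, y)
mse = np.mean((np.tanh(X @ W) @ v - y) ** 2)
```

The appeal of the decomposition is visible in block 1: rather than treating all weights as one opaque vector, the quadratic structure in the output layer is exploited to solve that sub-problem in closed form at every outer iteration.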

Details

Language :
English
ISSN :
2162-237X
Volume :
27
Issue :
11
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
119032840
Full Text :
https://doi.org/10.1109/TNNLS.2015.2475621