Exploring and comparing the best 'direct methods' for the efficient training of MLP-networks
- Source :
- IEEE Transactions on Neural Networks, Nov. 1996, Vol. 7, Issue 6, p. 1497, 3 pp.
- Publication Year :
- 1996
Abstract
- It is well known that the main difficulties of backpropagation-based algorithms are their susceptibility to local minima and their slow adaptation to the training patterns. In this paper, we present a class of algorithms that overcome these difficulties by using 'direct' numerical methods to compute the weight matrices. In particular, we investigate the performance of the algorithms FBFBK-LSB (the first part named for the authors' initials, the second meaning least-squares backpropagation) and iterative conjugate gradient singular-value decomposition (ICGSVD), introduced by Bärmann and Biegler-König and by the authors, respectively. Numerical results on several benchmark problems show the superior reliability and/or efficiency of our algorithm ICGSVD.
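To illustrate the 'direct method' idea the abstract describes, where a weight matrix is obtained by a closed-form numerical solve rather than by gradient descent, here is a minimal Python sketch. It fits only the output layer of a single-hidden-layer MLP via an SVD-based least-squares solve; the toy data, shapes, and variable names are assumptions for illustration, and this is not the published FBFBK-LSB or ICGSVD procedure.

```python
import numpy as np

# Hypothetical sketch of the "direct method" idea: solve for a weight
# matrix in closed form via an SVD-based least-squares fit instead of
# updating it by gradient descent. This is NOT the paper's FBFBK-LSB
# or ICGSVD algorithm, only a single-hidden-layer illustration.

rng = np.random.default_rng(0)

# Toy data (assumed shapes): N samples, d inputs, h hidden units, k targets.
N, d, h, k = 200, 5, 20, 2
X = rng.standard_normal((N, d))
Y = rng.standard_normal((N, k))

# Fixed random hidden layer; the full algorithms would also refine
# these weights iteratively, but here they stay fixed.
W1 = rng.standard_normal((d, h))
H = np.tanh(X @ W1)  # hidden activations, shape (N, h)

# "Direct" step: the output weights W2 minimize ||H @ W2 - Y||_F.
# np.linalg.lstsq solves this via an SVD of H.
W2, *_ = np.linalg.lstsq(H, Y, rcond=None)

pred = H @ W2
print("residual norm:", np.linalg.norm(pred - Y))
```

One reason an SVD-based solver fits this setting is robustness: it returns the minimum-norm solution even when the hidden activations `H` are rank-deficient, where a naive normal-equations solve would fail.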
Details
- ISSN :
- 1045-9227
- Volume :
- 7
- Issue :
- 6
- Database :
- Gale General OneFile
- Journal :
- IEEE Transactions on Neural Networks
- Publication Type :
- Academic Journal
- Accession number :
- edsgcl.18966066