Novel Algorithms for Noisy Minimization Problems with Applications to Neural Networks Training.

Authors :
SIRLANTZIS, K.
LAMB, J. D.
LIU, W. B.
Source :
Journal of Optimization Theory & Applications; May 2006, Vol. 129, Issue 2, p325-340, 16p, 4 Charts
Publication Year :
2006

Abstract

The supervisor and searcher cooperation (SSC) framework, introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms that combine the desirable features of two existing approaches. This work aims to develop efficient algorithms for a wide range of noisy optimization problems, including those posed by feedforward neural network training. It introduces two basic SSC algorithms: the first seems suited to generic problems, while the second is motivated by neural network training problems. It also introduces inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results on the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural network training problems. [ABSTRACT FROM AUTHOR]
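
For context, the noisy minimization problems treated in this class of work are commonly written in the general form below; feedforward network training fits this form when the objective is evaluated on sampled data. This is a standard illustrative formulation, not one taken from the paper itself.

\[
\min_{x \in \mathbb{R}^n} \; f(x) = \mathbb{E}_{\xi}\bigl[F(x,\xi)\bigr],
\]

where only noisy evaluations \(F(x,\xi)\), or noisy gradient estimates, are available to the algorithm. In feedforward network training, \(x\) collects the network weights, \(\xi\) indexes a training example, and \(F(x,\xi)\) is the per-example loss.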

Details

Language :
English
ISSN :
0022-3239
Volume :
129
Issue :
2
Database :
Complementary Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession number :
23770527
Full Text :
https://doi.org/10.1007/s10957-006-9066-z