A derivative-free optimization algorithm for the efficient minimization of functions obtained via statistical averaging.
- Source :
- Computational Optimization & Applications; May 2020, Vol. 76, Issue 1, p1-31, 31p
- Publication Year :
- 2020
-
Abstract
- This paper considers the efficient minimization of the infinite time average of a stationary ergodic process over a handful of design parameters which affect it. Problems of this class, derived from physical or numerical experiments which are sometimes expensive to perform, are ubiquitous in engineering applications. In such problems, any given function evaluation, determined with finite sampling, is associated with a quantifiable amount of uncertainty, which may be reduced via additional sampling. The present paper proposes a new optimization algorithm that adjusts the amount of sampling associated with each function evaluation, making function evaluations more accurate (and, thus, more expensive), as required, as convergence is approached. The work builds on our algorithm for Delaunay-based Derivative-free Optimization via Global Surrogates (Δ-DOGS, see JOGO 10.1007/s10898-015-0384-2). The new algorithm, dubbed α-DOGS, substantially reduces the overall cost of the optimization process for problems of this important class. Further, under certain well-defined conditions, rigorous proof of convergence to the global minimum of the problem considered is established. [ABSTRACT FROM AUTHOR]
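- The core idea in the abstract — spend few samples per evaluation early on, and concentrate sampling (hence accuracy and cost) where convergence is approached — can be illustrated with a minimal toy sketch. This is not the α-DOGS algorithm itself (which uses Delaunay-based global surrogates); `noisy_f`, `averaged_eval`, and `adaptive_search` are hypothetical names, and the doubling rule is a simple stand-in for the paper's sampling-adjustment strategy.

```python
import math
import random

def noisy_f(x, rng):
    """One noisy sample of the underlying objective at design point x
    (a hypothetical stand-in for one step of a stationary ergodic process)."""
    return (x - 0.3) ** 2 + rng.gauss(0.0, 0.5)

def averaged_eval(x, n_samples, rng):
    """Estimate the infinite-time average at x by finite sampling.
    Returns (sample mean, standard error ~ sigma / sqrt(n))."""
    samples = [noisy_f(x, rng) for _ in range(n_samples)]
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / max(n_samples - 1, 1)
    return mean, math.sqrt(var / n_samples)

def adaptive_search(candidates, rounds=6, base_n=16):
    """Toy adaptive-sampling loop: each round, double the sampling budget
    only at the currently most promising candidate, so evaluations become
    more accurate (and more expensive) only where it matters."""
    rng = random.Random(0)
    budget = {x: base_n for x in candidates}
    estimates = {x: averaged_eval(x, budget[x], rng) for x in candidates}
    for _ in range(rounds):
        best = min(candidates, key=lambda x: estimates[x][0])
        budget[best] *= 2  # refine: more samples -> smaller uncertainty
        estimates[best] = averaged_eval(best, budget[best], rng)
    best = min(candidates, key=lambda x: estimates[x][0])
    return best, estimates[best]

best_x, (mean, err) = adaptive_search([0.0, 0.25, 0.5, 0.75, 1.0])
```

Note the trade-off the sketch makes concrete: the standard error shrinks like 1/√n, so halving the uncertainty at a point costs four times the samples, which is why it pays to defer heavy sampling until a point looks promising.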
- Subjects :
- Mathematical optimization
- Stationary processes
- Global optimization
- Arithmetic mean
Details
- Language :
- English
- ISSN :
- 09266003
- Volume :
- 76
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Computational Optimization & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 142513387
- Full Text :
- https://doi.org/10.1007/s10589-020-00172-4