
On asymptotic convergence rate of random search.

Authors :
Tarłowski, Dawid
Source :
Journal of Global Optimization; May 2024, Vol. 89 Issue 1, p1-31, 31p
Publication Year :
2024

Abstract

This paper presents a general theoretical study of the asymptotic convergence rate (ACR) for finite-dimensional optimization. Given a continuous objective function and a discrete-time stochastic optimization process, the ACR is the optimal constant controlling the asymptotic behaviour of the expected approximation errors. Under general assumptions, the condition ACR < 1 implies linear behaviour of the expected time of hitting the ε-optimal sublevel set as ε → 0+ and determines an upper bound on the convergence rate of the trajectories of the process. The paper provides a general characterization of the ACR and, in particular, shows that some algorithms cannot converge linearly for any nontrivial continuous optimization problem. The relation between the asymptotic convergence rate in the objective space and the asymptotic convergence rate in the search space is established. Examples and numerical simulations using a (1+1) self-adaptive evolution strategy and other algorithms are presented. [ABSTRACT FROM AUTHOR]
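To make the abstract's notions concrete, the sketch below empirically estimates a convergence rate for a (1+1) evolution strategy on the sphere function. This is an illustrative assumption, not the paper's construction: the step-size adaptation used here is the classic 1/5 success rule (the paper's self-adaptive scheme may differ), and the "rate" computed is a geometric mean of successive error ratios over the tail of one run, a crude sample analogue of the ACR, which the paper defines for the expected errors. A rate below 1 corresponds to linear (geometric) decay of the approximation error.

```python
import math
import random

def sphere(x):
    """Objective f(x) = ||x||^2, minimized at the origin with value 0."""
    return sum(xi * xi for xi in x)

def one_plus_one_es(f, x0, sigma0=1.0, iters=2000, seed=0):
    """(1+1)-ES with 1/5 success rule step-size control (illustrative choice).

    Returns the sequence of best objective values (approximation errors,
    since the sphere's optimal value is 0).
    """
    rng = random.Random(seed)
    x, sigma = list(x0), sigma0
    fx = f(x)
    errors = [fx]
    for _ in range(iters):
        # Sample one Gaussian offspring around the current point.
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.5          # success: enlarge the step size
        else:
            sigma *= 1.5 ** -0.25  # failure: shrink it (1/5 rule balance)
        errors.append(fx)
    return errors

errors = one_plus_one_es(sphere, [5.0] * 5)

# Geometric-mean ratio of successive errors over the run's tail: a rough
# single-trajectory stand-in for the asymptotic convergence rate.
tail = errors[-500:]
rate = (tail[-1] / tail[0]) ** (1.0 / (len(tail) - 1))
```

On this well-conditioned problem the estimated rate comes out strictly below 1, consistent with the linear convergence regime the abstract associates with ACR < 1; averaging over many independent runs would give a better proxy for the expected-error behaviour.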

Details

Language :
English
ISSN :
0925-5001
Volume :
89
Issue :
1
Database :
Complementary Index
Journal :
Journal of Global Optimization
Publication Type :
Academic Journal
Accession number :
176911118
Full Text :
https://doi.org/10.1007/s10898-023-01342-4