
Analysis of Different Types of Regret in Continuous Noisy Optimization

Authors :
Sandra Astete-Morales, Marie-Liesse Cauwet, Olivier Teytaud
Laboratoire de Recherche en Informatique (LRI), Université Paris-Sud - Paris 11 (UP11), CentraleSupélec, Centre National de la Recherche Scientifique (CNRS)
Machine Learning and Optimisation (TAO), Inria Saclay - Ile de France, Institut National de Recherche en Informatique et en Automatique (Inria)
Editors :
T. Friedrich and F. Neumann
Source :
GECCO 2016, Genetic and Evolutionary Computation Conference, Jul 2016, Denver, United States. pp. 205-212
Publication Year :
2016

Abstract

The performance measure of an algorithm is a crucial part of its analysis. Performance can be assessed by studying the convergence rate of the algorithm in question: one studies some (hopefully convergent) sequence that measures how good the approximate optimum is compared to the true optimum. The concept of Regret is widely used in the bandit literature for assessing the performance of an algorithm. The same concept is also used in the framework of optimization algorithms, sometimes under other names or without a specific name, and the numerical evaluation of the convergence rate of noisy optimization algorithms often involves approximations of regret. We discuss here two types of approximations of Simple Regret used in practice for the evaluation of algorithms for noisy optimization. Using specific algorithms of different natures and the noisy sphere function, we show the following results. The approximation of Simple Regret used in some optimization testbeds, termed here Approximate Simple Regret, fails to estimate the Simple Regret convergence rate. We also discuss a recent approximation of Simple Regret, which we term Robust Simple Regret, and show its advantages and disadvantages.
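
The abstract does not spell out the regret definitions, so the sketch below uses the standard notion of Simple Regret (noise-free value of the recommended point minus the optimal value) and a plausible single-noisy-evaluation stand-in for it, on a noisy sphere. The toy optimizer, the noise model, and the exact form of the approximation are assumptions for illustration, not the paper's experimental setup; the sketch only shows why a single noisy evaluation need not track the Simple Regret convergence rate.

    # Minimal sketch (assumed setup, not the paper's): Simple Regret vs. a
    # single-noisy-evaluation approximation of it on a noisy sphere function.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 2

    def noisy_sphere(x):
        """Noisy sphere: ||x||^2 plus additive standard Gaussian noise."""
        return float(np.dot(x, x) + rng.normal())

    def true_value(x):
        """Noise-free objective, available to the analyst but not the algorithm."""
        return float(np.dot(x, x))

    def recommend(n_evals):
        """Toy optimizer: recommendation improves as more evaluations are used."""
        samples = rng.normal(scale=1.0 / np.sqrt(n_evals), size=(n_evals, dim))
        return samples.mean(axis=0)

    for n in (10, 100, 1000, 10000):
        x_rec = recommend(n)
        simple_regret = true_value(x_rec)           # f(x_rec) - f(x*), with f(x*) = 0
        approx_simple_regret = noisy_sphere(x_rec)  # one noisy evaluation at x_rec
        print(f"n={n:6d}  SR={simple_regret:.2e}  ASR={approx_simple_regret:+.2e}")

In this toy run the true Simple Regret shrinks with the budget, while the single noisy evaluation keeps constant-variance noise and therefore does not reveal the convergence rate, which is the kind of discrepancy the paper analyzes.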

Details

Language :
English
Database :
OpenAIRE
Journal :
GECCO 2016, Genetic and Evolutionary Computation Conference, Jul 2016, Denver, United States. pp. 205-212
Accession number :
edsair.doi.dedup.....62f1e38a2f3ee26c4ed2674dd4d6f67f