Noisy optimization complexity under locality assumption

Despite various recent publications on the subject, there are still gaps between the upper and lower bounds for evolutionary optimization of noisy objective functions. In this paper we narrow this gap and obtain bounds that are tight within logarithmic factors in the case of small noise and no long-distance influence on the objective function.
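For context, the setting referred to above is noisy black-box optimization: each query of the objective at a point returns its value corrupted by noise, and the optimizer must cope with unreliable comparisons. The following minimal Python sketch (not taken from the paper; the function names, resampling count, and step-size rule are illustrative assumptions) shows one common device from this literature, averaging repeated evaluations to reduce variance inside a simple (1+1)-style search.

```python
import numpy as np

# Minimal sketch (not from the paper): each query of f at x returns f(x) plus
# Gaussian noise; repeated evaluations ("resampling") are averaged so that
# comparisons between candidate points become more reliable.

rng = np.random.default_rng(0)

def noisy_eval(f, x, sigma=0.1):
    """Return a single noisy observation of f at x."""
    return f(x) + sigma * rng.normal()

def averaged_eval(f, x, resamplings=16, sigma=0.1):
    """Average several noisy observations to reduce variance."""
    return np.mean([noisy_eval(f, x, sigma) for _ in range(resamplings)])

def noisy_random_search(f, dim=2, budget=2000, resamplings=16):
    """(1+1)-style search driven by averaged noisy evaluations."""
    best_x = rng.normal(size=dim)
    best_val = averaged_eval(f, best_x, resamplings)
    step = 1.0
    evals = resamplings
    while evals + resamplings <= budget:
        cand = best_x + step * rng.normal(size=dim)
        cand_val = averaged_eval(f, cand, resamplings)
        evals += resamplings
        if cand_val < best_val:   # accept apparent improvement
            best_x, best_val = cand, cand_val
            step *= 1.5           # expand step size on success
        else:
            step *= 0.9           # shrink step size on failure
    return best_x

if __name__ == "__main__":
    sphere = lambda x: float(np.dot(x, x))  # noise-free ground truth
    print("approximate optimum:", noisy_random_search(sphere))
```

The resampling count trades evaluation budget against comparison reliability; the bounds discussed in the paper concern how fast any such scheme can converge as a function of the total number of noisy evaluations.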
