Analysing the Robustness of Evolutionary Algorithms to Noise: Refined Runtime Bounds and an Example Where Noise is Beneficial

We analyse the performance of well-known evolutionary algorithms, the $$(1+1)$$ EA and the $$(1+\lambda)$$ EA, in the prior noise model, where in each fitness evaluation the search point is altered before the evaluation with probability $$p$$. We present refined results for the expected optimisation time of these algorithms on the function LeadingOnes, where bits have to be optimised in sequence. Previous work showed that the $$(1+1)$$ EA on LeadingOnes runs in polynomial expected time if $$p = O((\log n)/n^2)$$ and needs superpolynomial expected time if $$p = \omega((\log n)/n)$$, leaving a huge gap for which no results were known. We close this gap by showing that the expected optimisation time is $$\varTheta(n^2) \cdot \exp(\varTheta(\min\{pn^2, n\}))$$ for all $$p \le 1/2$$, allowing us for the first time to locate the threshold between polynomial and superpolynomial expected times at $$p = \varTheta((\log n)/n^2)$$. Hence the $$(1+1)$$ EA on LeadingOnes is surprisingly sensitive to noise. We also show that offspring populations of size $$\lambda \ge 3.42\log n$$ can effectively deal with much higher noise than known before. Finally, we present an example of a rugged landscape where prior noise can help to escape from local optima by blurring the landscape and allowing a hill climber to see the underlying gradient. We prove that in this particular setting noise can have a highly beneficial effect on performance.
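To see how the threshold follows from the refined bound (a short derivation of the consequence stated in the abstract, not a summary of the paper's proof): for $$p = c(\log n)/n^2$$ with a constant $$c > 0$$ we have $$\min\{pn^2, n\} = c \log n$$ for large $$n$$, so the bound becomes $$\varTheta(n^2) \cdot \exp(\varTheta(c \log n)) = \varTheta(n^2) \cdot n^{\varTheta(c)}$$, which is polynomial; conversely, $$p = \omega((\log n)/n^2)$$ gives $$pn^2 = \omega(\log n)$$ and hence an $$\exp(\omega(\log n))$$ factor, which is superpolynomial. The threshold between the two regimes therefore lies at $$p = \varTheta((\log n)/n^2)$$.

The following is a minimal Python sketch of the setting described above, not taken from the paper: a $$(1+1)$$ EA with standard bit mutation on LeadingOnes under prior noise. It assumes one common instantiation of the prior noise model, in which with probability $$p$$ a single uniformly random bit is flipped before the evaluation, and it re-evaluates both parent and offspring in every iteration; function names and the termination check are illustrative choices, not part of the analysed algorithm.

```python
import math
import random


def leading_ones(x):
    """Number of consecutive 1-bits at the start of the bit string x."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count


def noisy_eval(x, p):
    """Prior noise (one common instantiation): with probability p, evaluate a
    copy of x in which one uniformly random bit has been flipped; otherwise
    evaluate x itself."""
    if random.random() < p:
        y = list(x)
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return leading_ones(y)
    return leading_ones(x)


def one_plus_one_ea(n, p, max_evals=1_000_000):
    """(1+1) EA with standard bit mutation (rate 1/n); parent and offspring
    are both (re-)evaluated under noise in every iteration."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    # The true fitness is used only to detect when the optimum is reached;
    # the algorithm itself only ever sees the noisy values.
    while leading_ones(x) < n and evals < max_evals:
        # Standard bit mutation: flip each bit independently with prob. 1/n.
        y = [1 - b if random.random() < 1.0 / n else b for b in x]
        fx, fy = noisy_eval(x, p), noisy_eval(y, p)
        evals += 2
        if fy >= fx:  # elitist acceptance based on the noisy fitness values
            x = y
    return evals


if __name__ == "__main__":
    # Example run with noise strength around the threshold Theta((log n)/n^2).
    n = 100
    p = math.log(n) / n**2
    print(one_plus_one_ea(n, p))
```

Varying $$p$$ in this sketch between $$o((\log n)/n^2)$$ and $$\omega((\log n)/n^2)$$ gives an informal impression of the sharp transition that the paper proves rigorously.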
