Parallel Problem Solving from Nature – PPSN XV

It is known that the (1 + 1)-EA with mutation rate c/n optimises every monotone function efficiently if c < 1, and needs exponential time on some monotone functions (HotTopic functions) if c > c0 = 2.13692... We study the same question for a large variety of algorithms, in particular for the (1 + λ)-EA, the (μ + 1)-EA, the (μ + 1)-GA, their fast counterparts such as the fast (1 + 1)-EA, and the (1 + (λ, λ))-GA. We prove that all considered mutation-based algorithms exhibit a similar dichotomy for HotTopic functions, or even for all monotone functions. For the (1 + (λ, λ))-GA, this dichotomy is in the parameter cγ, which is the expected number of bit flips in an individual after mutation and crossover, neglecting selection. For the fast algorithms, the dichotomy is in m2/m1, where m1 and m2 are the first and second falling moments of the number of bit flips. Surprisingly, the range of efficient parameters is affected neither by the population size μ nor by the offspring population size λ. The picture changes completely if crossover is allowed: the genetic algorithms (μ + 1)-GA and (μ + 1)-fGA are efficient for arbitrary mutation strengths if μ is large enough.
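The dichotomy above is stated for the (1 + 1)-EA with standard bit mutation at rate c/n on monotone functions. The following is a minimal sketch of that baseline algorithm, not code from the paper: the fitness function here is OneMax (the simplest monotone function), and the function names, default parameters, and iteration budget are illustrative assumptions.

```python
import random

def one_max(x):
    """Example monotone fitness: the number of one-bits in the string."""
    return sum(x)

def one_plus_one_ea(n, c=0.9, fitness=one_max, max_iters=10**6, rng=random):
    """(1+1)-EA: flip each bit independently with probability c/n,
    keep the offspring if it is at least as fit (elitist selection)."""
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    for t in range(1, max_iters + 1):
        # standard bit mutation with rate c/n
        y = [1 - bit if rng.random() < c / n else bit for bit in x]
        fy = fitness(y)
        if fy >= fx:          # accept if not worse (ties accepted)
            x, fx = y, fy
        if fx == n:           # optimum of OneMax: the all-ones string
            return t
    return None               # budget exhausted

if __name__ == "__main__":
    # For c < 1 the expected optimisation time on any monotone function is
    # polynomial (O(n log n)); above the threshold c0 = 2.13692... some
    # monotone (HotTopic) functions require exponential time.
    print(one_plus_one_ea(n=100, c=0.9))
```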
