A tight lower bound on the expected runtime of standard steady state genetic algorithms

Recent progress in the runtime analysis of evolutionary algorithms (EAs) has allowed the derivation of upper bounds on the expected runtime of standard steady-state GAs. These upper bounds show speed-ups of GAs using crossover and mutation over the corresponding algorithms that use mutation alone (i.e., steady-state EAs), both for a standard unimodal benchmark function (OneMax) and a multimodal one (Jump). The upper bounds suggest that the GA benefits from populations as well as from mutation rates higher than the default 1/n. However, rigorous claims were not possible because matching lower bounds were unavailable. Proving lower bounds for crossover-based EAs is notoriously difficult, as it is hard to capture the progress that a diverse population can make. We use a potential-function approach to prove a tight lower bound on the expected runtime of the (2 + 1) GA for OneMax for all mutation rates c/n with c < 1.422. This provides the last piece of the puzzle completing the proof that larger population sizes improve the performance of the standard steady-state GA on OneMax for various mutation rates, and it shows that the optimal mutation rate for the (2 + 1) GA on OneMax is [EQUATION].
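
To make the setting concrete, the sketch below is a minimal Python implementation of a standard steady-state (2 + 1) GA on OneMax, with uniform crossover and standard bit mutation at rate c/n. The parent-selection and tie-breaking rules and the example value c = 1.2 (chosen inside the analysed range c < 1.422) are illustrative assumptions, not necessarily the exact algorithm variant analysed in the paper.

import random

def onemax(x):
    # OneMax fitness: number of 1-bits in the bitstring.
    return sum(x)

def two_plus_one_ga(n, c=1.2, max_iters=10**7):
    # Population of size 2, initialised uniformly at random.
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(2)]
    for t in range(1, max_iters + 1):
        # Parent selection: two parents uniformly at random, with replacement.
        p1, p2 = random.choice(pop), random.choice(pop)
        # Uniform crossover: each bit taken from either parent with prob. 1/2.
        child = [p1[i] if random.random() < 0.5 else p2[i] for i in range(n)]
        # Standard bit mutation: flip each bit independently with prob. c/n.
        child = [1 - b if random.random() < c / n else b for b in child]
        # Steady-state replacement: offspring replaces a worst individual
        # if it is at least as fit (tie-breaking towards the offspring here).
        worst = min(range(2), key=lambda i: onemax(pop[i]))
        if onemax(child) >= onemax(pop[worst]):
            pop[worst] = child
        if max(onemax(x) for x in pop) == n:
            return t  # generations until the optimum is found
    return None

if __name__ == "__main__":
    print(two_plus_one_ga(n=100, c=1.2))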
