The efficiency threshold for the offspring population size of the (µ, λ) EA

Understanding when evolutionary algorithms are efficient, and how they solve problems efficiently, is one of the central research tasks in evolutionary computation. In this work, we make progress in understanding the interplay between the parent and offspring population sizes of the (µ, λ) EA. Roughly speaking, previous works indicate that for λ ≥ (1 + ε)eµ this EA easily optimizes the OneMax function, whereas an offspring population size λ ≤ (1 − ε)eµ leads to an exponential runtime. Motivated also by the observation that in the efficient regime the (µ, λ) EA loses its ability to escape local optima, we take a closer look at this phase transition. Among other results, we show that when µ ≤ n^(1/2−c) for any constant c > 0, then for any λ ≤ eµ the runtime is super-polynomial. However, if µ ≥ n^(2/3+c), then for any λ ≥ eµ the runtime is polynomial. For the latter result we observe that the (µ, λ) EA profits from better individuals also because these, by creating slightly worse offspring, stabilize slightly sub-optimal sub-populations. While these first results close to the phase transition do not yet give a complete picture, they indicate that the boundary between efficient and super-polynomial runtimes is not simply the line λ = eµ, and that the reasons for efficiency or its absence are more complex than previously known.
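To make the algorithm under study concrete, the following is a minimal Python sketch of the (µ, λ) EA on OneMax with standard bit mutation at rate 1/n. The function names, the uniform parent selection, and the generation budget are illustrative choices for this sketch, not details taken from the paper.

```python
import random

def one_max(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)

def mu_comma_lambda_ea(n, mu, lam, max_gens=2000, seed=0):
    """Minimal (mu, lambda) EA sketch with standard bit mutation (rate 1/n).

    Each generation, lam offspring are created from uniformly chosen parents;
    the next parent population consists of the mu best offspring only
    (comma selection: the parents are always discarded).
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for gen in range(max_gens):
        offspring = []
        for _ in range(lam):
            parent = rng.choice(pop)
            # standard bit mutation: flip each bit independently with prob. 1/n
            child = [1 - b if rng.random() < 1.0 / n else b for b in parent]
            offspring.append(child)
        # comma selection: keep the mu best offspring, drop all parents
        offspring.sort(key=one_max, reverse=True)
        pop = offspring[:mu]
        if one_max(pop[0]) == n:
            return gen + 1  # generations until the optimum was sampled
    return None  # optimum not found within the budget
```

With a ratio λ/µ well above e (e.g. µ = 5, λ = 50), the sketch typically reaches the optimum quickly, while shrinking λ toward eµ or below makes progress stall, which is the phase transition the abstract describes.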
