The Efficiency Threshold for the Offspring Population Size of the $(\mu,\lambda)$ EA

Understanding when evolutionary algorithms are efficient, and how they efficiently solve problems, is one of the central research tasks in evolutionary computation. In this work, we make progress in understanding the interplay between the parent and offspring population sizes of the $(\mu,\lambda)$ EA. Previous works, roughly speaking, indicate that for $\lambda \ge (1+\varepsilon) e \mu$ this EA easily optimizes the OneMax function, whereas an offspring population size $\lambda \le (1-\varepsilon) e \mu$ leads to an exponential runtime. Motivated also by the observation that in the efficient regime the $(\mu,\lambda)$ EA loses its ability to escape local optima, we take a closer look at this phase transition. Among other results, we show that when $\mu \le n^{1/2 - c}$ for any constant $c > 0$, then for any $\lambda \le e \mu$ the runtime is super-polynomial. However, if $\mu \ge n^{2/3 + c}$, then for any $\lambda \ge e \mu$ the runtime is polynomial. For the latter result we observe that the $(\mu,\lambda)$ EA profits from better individuals also because these, by creating slightly worse offspring, stabilize slightly sub-optimal sub-populations. While these first results close to the phase transition do not yet give a complete picture, they indicate that the boundary between efficient and super-polynomial runtimes is not simply the line $\lambda = e \mu$, and that the reasons for efficiency or inefficiency are more complex than previously known.
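
To make the setup concrete, the following is a minimal Python sketch of the $(\mu,\lambda)$ EA on OneMax as it is commonly studied in this line of work: uniform parent selection, standard bit mutation with rate $1/n$, and non-elitist comma selection of the $\mu$ best offspring. The function names (`mu_comma_lambda_ea`, `one_max`), the generation cap `max_gens`, and the parameter values in the usage example are illustrative assumptions, not taken from the paper.

```python
import random


def one_max(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)


def mu_comma_lambda_ea(n, mu, lam, max_gens=10_000, seed=None):
    """Minimal (mu, lambda) EA on OneMax with standard bit mutation (rate 1/n).

    Comma (non-elitist) selection: the next parent population consists of the
    mu best offspring; all parents are discarded. Returns the first generation
    in which the optimum (the all-ones string) is sampled, or None on timeout.
    """
    rng = random.Random(seed)
    # Initialize mu parents uniformly at random.
    parents = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for gen in range(1, max_gens + 1):
        offspring = []
        for _ in range(lam):
            parent = rng.choice(parents)  # uniform parent selection
            # Standard bit mutation: flip each bit independently with prob. 1/n.
            child = [1 - b if rng.random() < 1.0 / n else b for b in parent]
            offspring.append(child)
        if any(one_max(c) == n for c in offspring):
            return gen
        # Comma selection: keep only the mu best offspring.
        offspring.sort(key=one_max, reverse=True)
        parents = offspring[:mu]
    return None


if __name__ == "__main__":
    # Illustrative comparison around lambda = e * mu (values picked arbitrarily):
    print(mu_comma_lambda_ea(n=50, mu=5, lam=20, seed=0))  # lambda > e * mu
    print(mu_comma_lambda_ea(n=50, mu=5, lam=10, seed=0))  # lambda < e * mu
```

Varying `lam` slightly above or below $e \mu$ (here $e \mu \approx 13.6$ for $\mu = 5$) gives a rough feel for the phase transition discussed above, though the asymptotic statements of the paper only apply for large $n$.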
