On the choice of the parameter control mechanism in the (1+(λ, λ)) genetic algorithm

The self-adjusting (1 + (λ, λ)) GA is the best known genetic algorithm for problems with a good fitness-distance correlation, such as OneMax. It uses a parameter control mechanism for the parameter λ that governs the mutation strength and the number of offspring. However, on multimodal problems, this mechanism tends to increase λ uncontrollably. We study this problem and possible solutions to it using rigorous runtime analysis for the standard Jump_k benchmark problem class. The original algorithm behaves like a (1+n) EA whenever the maximum value λ = n is reached, which is ineffective for problems where large jumps are required. Capping λ at smaller values is beneficial for such problems. Finally, resetting λ to 1 allows the parameter to cycle through the parameter space. We show that this strategy is effective for all Jump_k problems: the (1 + (λ, λ)) GA performs as well as the (1 + 1) EA with the optimal mutation rate and fast evolutionary algorithms, apart from a small polynomial overhead. Along the way, we present new general methods for bounding the runtime of the (1 + (λ, λ)) GA that allow existing runtime bounds for the (1 + 1) EA to be translated to the self-adjusting (1 + (λ, λ)) GA. These methods are easy to use and give upper bounds for novel classes of functions.
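To make the parameter control mechanism and the two modifications (capping λ at n versus resetting λ to 1) concrete, the sketch below gives one plausible Python reading of the self-adjusting (1 + (λ, λ)) GA with the one-fifth success rule. It is an illustrative sketch, not the paper's exact algorithm: the names (self_adjusting_ollga, lam_reset, F, max_evals) are assumptions introduced here, the Jump_k definition shown is the standard one, and some implementation details (e.g. re-evaluating crossover offspring identical to the parent) are simplified.

```python
import random

def one_max(x):
    """OneMax fitness: number of one-bits."""
    return sum(x)

def jump(x, k):
    """Standard Jump_k fitness: OneMax-like outside the gap, penalised inside it."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones

def self_adjusting_ollga(f, n, lam_reset=True, F=1.5, max_evals=10**6, seed=None):
    """Sketch of the self-adjusting (1 + (lambda, lambda)) GA.

    lam_reset=True uses the 'reset lambda to 1' strategy discussed in the abstract;
    lam_reset=False caps lambda at n as in the original algorithm.
    Assumes the all-ones string is the optimum (true for OneMax and Jump_k).
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)
    f_opt = f([1] * n)
    lam, evals = 1.0, 1
    while fx < f_opt and evals < max_evals:
        lam_int = max(1, round(lam))
        p, c = lam_int / n, 1.0 / lam_int        # mutation rate lambda/n, crossover bias 1/lambda

        # Mutation phase: flip the same number ell of bits in each of lambda offspring.
        ell = sum(rng.random() < p for _ in range(n))   # ell ~ Bin(n, lambda/n)
        best_mut, best_mut_f = None, None
        for _ in range(lam_int):
            y = x[:]
            for i in rng.sample(range(n), ell):
                y[i] ^= 1
            fy = f(y)
            evals += 1
            if best_mut is None or fy > best_mut_f:
                best_mut, best_mut_f = y, fy

        # Crossover phase: biased uniform crossover between x and the best mutant.
        best_cross, best_cross_f = x, fx
        for _ in range(lam_int):
            y = [best_mut[i] if rng.random() < c else x[i] for i in range(n)]
            fy = f(y)
            evals += 1
            if fy > best_cross_f:
                best_cross, best_cross_f = y, fy

        # Elitist selection and one-fifth success rule for lambda.
        success = best_cross_f > fx
        if best_cross_f >= fx:
            x, fx = best_cross, best_cross_f
        if success:
            lam = max(lam / F, 1.0)
        else:
            lam = lam * F ** 0.25
            if lam > n:
                lam = 1.0 if lam_reset else float(n)
    return x, fx, evals
```

For example, `self_adjusting_ollga(lambda x: jump(x, 3), n=50, lam_reset=True)` runs the reset variant on Jump_3, while `lam_reset=False` reproduces the capped behaviour in which the algorithm degenerates to a (1+n)-EA-like scheme once λ = n is reached.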
