Generalized jump functions

Jump functions are the most studied non-unimodal benchmark in the theory of evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure, in which the only way to leave the local optimum is to jump directly to the global optimum, raises the question of how representative the recent findings are. For this reason, we propose an extended class Jump_{k,δ} of jump functions that incorporate a valley of low fitness of width δ starting at distance k from the global optimum. We prove that several previous results extend to this more general class: for all k = o(n^{1/3}) and δ ≤ k, the optimal mutation rate for the (1+1) EA is δ/n, and the fast (1+1) EA runs faster than the classical (1+1) EA by a factor super-exponential in δ. However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast (1+1) EA by a factor polynomial in k on Jump_k, is slower by a factor polynomial in n on some Jump_{k,δ} instances. Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
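To make the construction concrete, below is a minimal sketch in Python, assuming one plausible formalization of Jump_{k,δ}: points outside the valley receive fitness |x|_1 + δ and valley points receive n − |x|_1, mirroring the classic Jump_k definition. The paper's exact fitness values may differ in detail; only the location and width of the valley follow from the description above. The (1+1) EA loop and the illustrative parameters n, k, δ at the bottom are likewise assumptions for demonstration, not the paper's experimental setup.

```python
import random

def jump_k_delta(x, k, delta):
    """One plausible formalization of Jump_{k,delta} (assumption:
    valley points get fitness n - |x|_1, mirroring classic Jump_k).
    The valley covers |x|_1 strictly between n-k-delta and n-k,
    i.e. a gap that ends at distance k from the all-ones optimum
    and requires flipping exactly delta bits to cross."""
    n = len(x)
    m = sum(x)  # number of one-bits
    if m <= n - k - delta or m >= n - k:
        return m + delta  # fitness increases toward the optimum
    return n - m          # low, deceptive fitness inside the valley

def one_plus_one_ea(n, k, delta, rate, max_iters=10**6):
    """(1+1) EA with standard bit mutation at a fixed rate.
    Starting from a random string, it hillclimbs to the local
    optimum at distance k+delta and must then flip exactly the
    delta missing bits (and no others) to cross the valley."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = jump_k_delta(x, k, delta)
    for t in range(1, max_iters + 1):
        # flip each bit independently with the given probability
        y = [b ^ (random.random() < rate) for b in x]
        fy = jump_k_delta(y, k, delta)
        if fy >= fx:  # elitist acceptance, ties accepted
            x, fx = y, fy
        if sum(x) == n:  # global optimum: the all-ones string
            return t
    return None

# Example run on a small instance (illustrative parameters).
n, k, delta = 50, 5, 3
print(one_plus_one_ea(n, k, delta, rate=delta / n))
```

Under this formalization, setting k = 0 recovers the classic Jump_δ function. The choice rate = δ/n reflects the optimal-rate result quoted above: the probability p^δ (1−p)^{n−δ} of flipping exactly the δ bits needed to cross the valley is maximized at p = δ/n.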
