When is it beneficial to reject improvements?

We investigate two popular trajectory-based algorithms from biology and physics to answer a question of general significance: when is it beneficial to reject improvements? A distinguishing factor of SSWM (Strong Selection Weak Mutation), a popular model from population genetics, compared to the Metropolis algorithm (MA), is that the former can reject improvements, while the latter always accepts them. We investigate when one strategy outperforms the other. Since we prove that both algorithms converge to the same stationary distribution, we concentrate on identifying a class of functions that induce large mixing times, on which one algorithm can outperform the other for a long period before the stationary distribution is reached. The outcome of the analysis is the definition of a function where SSWM is efficient, while Metropolis requires at least exponential time.
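The contrast between the two acceptance rules can be made concrete. Below is a minimal sketch, assuming the standard Boltzmann acceptance rule for Metropolis and Kimura's fixation probability for SSWM (the usual forms in the population-genetics literature; the exact parameterisation used in the paper may differ). Note how the SSWM probability is strictly below 1 even for fitness improvements, which is precisely its capacity to reject them.

```python
import math

def metropolis_accept(delta_f: float, temperature: float) -> float:
    """Metropolis rule (maximisation): always accept improvements,
    accept worsenings with probability exp(delta_f / temperature)."""
    if delta_f >= 0:
        return 1.0
    return math.exp(delta_f / temperature)

def sswm_accept(delta_f: float, n: int, beta: float) -> float:
    """SSWM rule: accept a mutant with fitness difference delta_f
    according to Kimura's fixation probability for a population of
    size n with selection strength beta. Improvements (delta_f > 0)
    are accepted with probability strictly less than 1."""
    if delta_f == 0:
        return 1.0 / n  # limit of the expression as delta_f -> 0
    return (1.0 - math.exp(-2.0 * beta * delta_f)) / \
           (1.0 - math.exp(-2.0 * n * beta * delta_f))
```

For example, with `n = 10` and `beta = 1`, an improvement of `delta_f = 1` is accepted by SSWM with probability roughly 0.86, i.e. it is rejected about 14% of the time, whereas Metropolis accepts it with certainty.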
