MMAS vs. population-based EA on a family of dynamic fitness functions

We study the behavior of a population-based EA and the Max-Min Ant System (MMAS) on a family of deterministically changing fitness functions, where, in order to find the global optimum, the algorithms have to locate a specific local optimum within each of a series of phases. In particular, we prove that a (2+1) EA with genotype diversity is able to find the global optimum of the Maze function, previously considered by Kötzing and Molter (PPSN 2012, 113--122), in polynomial time. We then generalize this to a hierarchy result: for every μ, a (μ+1) EA with genotype diversity is able to track a Maze function extended over a finite alphabet of μ symbols, whereas population size μ-1 is not sufficient. Furthermore, we show that MMAS does not require additional modifications to track the optimum of the finite-alphabet Maze functions, and, using a novel drift statement to simplify the analysis, we reduce the phase length required by the Maze function.
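The (μ+1) EA with genotype diversity discussed above can be sketched as follows. This is an illustrative reconstruction, not the paper's formal definition: the function name and parameters are our own, and the sketch runs on a generic (possibly time-dependent) fitness `fitness(x, t)` rather than on the Maze function itself, whose exact phase structure is defined in the cited works.

```python
import random

def mu_plus_one_ea_genotype_diversity(fitness, n, mu, generations, seed=0):
    """Illustrative (mu+1) EA with genotype diversity on bit strings of length n.

    fitness(x, t) may depend on the generation counter t, modelling a
    dynamic problem. Genotype diversity means an offspring whose genotype
    is already present in the population is discarded. Assumes mu <= 2**n.
    """
    rng = random.Random(seed)
    pop = []
    while len(pop) < mu:  # mu pairwise distinct random initial genotypes
        x = tuple(rng.randint(0, 1) for _ in range(n))
        if x not in pop:
            pop.append(x)
    for t in range(generations):
        parent = rng.choice(pop)  # uniform parent selection
        # standard bit mutation: flip each bit independently with probability 1/n
        child = tuple(b ^ 1 if rng.random() < 1.0 / n else b for b in parent)
        if child in pop:
            continue  # diversity mechanism: duplicate genotypes are rejected
        pop.append(child)
        # (mu+1) selection: remove an individual of worst current fitness
        worst = min(pop, key=lambda y: fitness(y, t))
        pop.remove(worst)
    return pop
```

On a static function such as OneMax (`fitness = lambda x, t: sum(x)`), the best individual is never lost, since selection only removes a worst individual; a dynamic Maze-style function would instead change `fitness(., t)` between phases, forcing the population to track the moving optimum.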

[1] Pietro Simone Oliveto, et al. Analysis of diversity mechanisms for optimisation in dynamic environments with low frequencies of change, 2013, GECCO '13.

[2] Thomas Jansen, et al. Theoretical analysis of a mutation-based evolutionary algorithm for a tracking problem in the lattice, 2005, GECCO '05.

[3] Frank Neumann, et al. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity, 2012, GECCO '12.

[4] Thomas Stützle, et al. MAX-MIN Ant System, 2000, Future Gener. Comput. Syst.

[5] Timo Kötzing, et al. ACO Beats EA on a Dynamic Pseudo-Boolean Function, 2012, PPSN.

[6] Enrique Alba, et al. Metaheuristics for Dynamic Optimization, 2012.

[7] Carsten Witt, et al. Bioinspired Computation in Combinatorial Optimization, 2010.

[8] Carsten Witt, et al. Fitness levels with tail bounds for the analysis of randomized search heuristics, 2014, Inf. Process. Lett.

[9] Thomas Jansen, et al. Evolutionary algorithms and artificial immune systems on a bi-stable dynamic optimisation problem, 2014, GECCO.

[10] Christian Gunia, et al. On the analysis of the approximation capability of simple evolutionary algorithms for scheduling problems, 2005, GECCO '05.

[11] Per Kristian Lehre, et al. General Drift Analysis with Tail Bounds, 2013, arXiv.

[12] Anne Auger, et al. Theory of Randomized Search Heuristics: Foundations and Recent Developments, 2011.

[13] Dirk Sudholt, et al. Running time analysis of Ant Colony Optimization for shortest path problems, 2012, J. Discrete Algorithms.

[14] Tobias Storch, et al. On the Choice of the Parent Population Size, 2008, Evolutionary Computation.

[15] Shengxiang Yang, et al. Evolutionary dynamic optimization: A survey of the state of the art, 2012, Swarm Evol. Comput.

[16] Stefan Droste, et al. Analysis of the (1+1) EA for a Dynamically Bitwise Changing OneMax, 2003.

[17] B. Hajek. Hitting-time and occupation-time bounds implied by drift analysis with applications, 1982, Advances in Applied Probability.

[18] Per Kristian Lehre, et al. Dynamic evolutionary optimisation: an analysis of frequency and magnitude of change, 2009, GECCO.

[19] Dirk Sudholt, et al. Using Markov-chain mixing time estimates for the analysis of ant colony optimization, 2011, FOGA '11.

[20] Benjamin Doerr, et al. Run-time analysis of the (1+1) evolutionary algorithm optimizing linear functions over a finite alphabet, 2012, GECCO '12.