Lower bounds on the run time of the Univariate Marginal Distribution Algorithm on OneMax

Abstract: The Univariate Marginal Distribution Algorithm (UMDA), a popular estimation-of-distribution algorithm, is studied from a run time perspective. On the classical OneMax benchmark function on bit strings of length n, a lower bound of Ω(λ + μ√n + n log n) on its expected run time is proved, where μ and λ are algorithm-specific parameters. This is the first direct lower bound on the run time of the UMDA. It is stronger than the bounds that follow from general black-box complexity theory and is matched by the run times of many evolutionary algorithms. The results are obtained through advanced analyses of the stochastic change of the frequencies of bit values maintained by the algorithm, including carefully designed potential functions. These techniques may prove useful in advancing the field of run time analysis for estimation-of-distribution algorithms in general.
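
The abstract refers to the frequency vector maintained by the UMDA and to its parameters μ (parent population size) and λ (offspring population size). For orientation, the following Python sketch shows the standard UMDA with margins applied to OneMax; it is not code from the paper, and the function name umda as well as the parameter values in the example run are illustrative assumptions only. The stochastic change of the frequencies that the lower-bound analysis tracks is the update and clamping step at the end of each iteration.

import random

def onemax(x):
    # OneMax: the number of ones in the bit string.
    return sum(x)

def umda(n, mu, lam, max_iters=10_000):
    # Frequency vector: probability of sampling a 1 at each position,
    # initialized to 1/2 and kept within the margins [1/n, 1 - 1/n].
    p = [0.5] * n
    lower, upper = 1.0 / n, 1.0 - 1.0 / n
    evaluations = 0
    for _ in range(max_iters):
        # Sample lam offspring independently from the product distribution.
        population = [[1 if random.random() < p[i] else 0 for i in range(n)]
                      for _ in range(lam)]
        evaluations += lam
        # Select the mu best individuals with respect to OneMax.
        population.sort(key=onemax, reverse=True)
        selected = population[:mu]
        if onemax(selected[0]) == n:
            return evaluations  # optimum found
        # Set each frequency to the empirical marginal of the selected set,
        # then clamp it to the margins so no frequency fixes at 0 or 1.
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, lower), upper)
    return evaluations

if __name__ == "__main__":
    # Example run with parameters chosen only for illustration.
    print(umda(n=100, mu=50, lam=100))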
