Bias and variance in continuous EDA

Estimation of Distribution Algorithms are based on statistical estimates. We show that combining classical tools from statistics, namely the bias/variance decomposition, reweighting, and quasi-randomization, can strongly improve the convergence rate. All modifications are simple, compatible with most algorithms, and experimentally very effective, in particular in the parallel case (large offspring populations).
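
To make the combination concrete, here is a minimal sketch of a continuous Gaussian EDA that uses two of the ideas named above: quasi-random sampling of the offspring and weighted (reweighted) re-estimation of the search distribution. It is an illustration under stated assumptions, not the authors' exact algorithm; the separable (diagonal) Gaussian model, the scrambled Halton sequence, the log-rank weights, and the toy sphere objective are all assumed choices.

```python
# Sketch of a continuous Gaussian EDA with quasi-random sampling and reweighting.
# Assumptions (not from the paper): separable Gaussian model, scrambled Halton
# offspring, log-rank selection weights, sphere objective.

import numpy as np
from scipy.stats import norm, qmc


def sphere(x):
    """Toy objective: minimise the squared norm."""
    return float(np.dot(x, x))


def qr_gaussian_eda(f, dim=10, lam=100, mu=25, sigma=1.0, iterations=50, seed=0):
    mean = np.full(dim, 5.0)                 # arbitrary starting point
    std = np.full(dim, sigma)                # diagonal (separable) Gaussian model
    halton = qmc.Halton(d=dim, scramble=True, seed=seed)

    # Log-rank weights, an assumed reweighting scheme: the best point gets the
    # largest weight, and the weights sum to one.
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()

    for _ in range(iterations):
        # Quasi-random offspring: scrambled Halton points mapped through the
        # inverse Gaussian CDF instead of i.i.d. pseudo-random draws.
        u = np.clip(halton.random(lam), 1e-12, 1 - 1e-12)
        z = norm.ppf(u)
        pop = mean + std * z

        fitness = np.array([f(x) for x in pop])
        best = pop[np.argsort(fitness)[:mu]]  # truncation selection

        # Weighted re-estimation of the Gaussian model (the reweighting step).
        mean = w @ best
        std = np.sqrt(w @ (best - mean) ** 2) + 1e-12

    return mean, f(mean)


if __name__ == "__main__":
    m, fm = qr_gaussian_eda(sphere)
    print("final mean fitness:", fm)
```

Replacing the Halton draws with `np.random.default_rng(seed).standard_normal((lam, dim))` gives the plain Monte Carlo baseline against which the quasi-random and reweighted variants can be compared.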
