Empirical Investigation of Simplified Step-Size Control in Metaheuristics with a View to Theory

Randomized direct-search methods for optimizing a function f : R^n → R that is given only by a black box for f-evaluations are investigated. We consider cumulative step-size adaptation (CSA) for the variance of multivariate zero-mean normal distributions, which are commonly used to sample new candidate solutions within metaheuristics, in particular within the CMA Evolution Strategy (CMA-ES), a state-of-the-art direct-search method. Although the CMA-ES is very successful in practical optimization, its theoretical foundations are rather limited because of the complex stochastic process it induces. To advance the theory on this successful method, we propose two simplifications of the CSA used for step-size control within the CMA-ES. Experimental and statistical evaluation shows that, in the scenario considered, they perform sufficiently similarly to the original CSA for a further theoretical analysis to be reasonable. Furthermore, we outline in detail a probabilistic/theoretical runtime analysis for one of the two CSA derivatives.
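To make the mechanism under discussion concrete, the following is a minimal sketch, in Python, of how the standard CSA rule steers the step size of a (1,λ)-ES on the sphere function. The parameter settings (c_sigma, d_sigma, lam) and the function name one_comma_lambda_csa are illustrative assumptions, not taken from the paper, and the two simplified CSA derivatives proposed there are not reproduced here.

```python
# A minimal sketch of cumulative step-size adaptation (CSA) in a (1,lambda)-ES
# minimizing the sphere function f(x) = ||x||^2. Parameter choices below are
# common textbook defaults, assumed here for illustration only.
import numpy as np

def one_comma_lambda_csa(n=10, lam=10, sigma=1.0, iterations=2000, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)           # current parent
    p = np.zeros(n)                      # evolution path (cumulated search path)
    c_sigma = 1.0 / np.sqrt(n)           # cumulation constant (assumed default)
    d_sigma = 1.0                        # damping factor (assumed default)
    chi_n = np.sqrt(n) * (1 - 1/(4*n) + 1/(21*n**2))  # approx. E[||N(0,I)||]

    f = lambda y: float(np.dot(y, y))    # sphere function as the black box

    for _ in range(iterations):
        # sample lambda offspring as x + sigma * N(0, I)
        z = rng.standard_normal((lam, n))
        offspring = x + sigma * z
        best = int(np.argmin([f(y) for y in offspring]))
        x = offspring[best]              # (1,lambda)-selection: keep the best offspring

        # accumulate the selected standardized step in the evolution path
        p = (1 - c_sigma) * p + np.sqrt(c_sigma * (2 - c_sigma)) * z[best]

        # CSA rule: enlarge sigma if the path is longer than expected under
        # random selection, shrink it if the path is shorter
        sigma *= np.exp((c_sigma / d_sigma) * (np.linalg.norm(p) / chi_n - 1))

    return x, sigma, f(x)

if __name__ == "__main__":
    x, sigma, fx = one_comma_lambda_csa()
    print(f"final f(x) = {fx:.3e}, final sigma = {sigma:.3e}")
```

The design intuition is that under pure random selection the evolution path has expected length E[||N(0,I)||]; consistently longer paths indicate that successive steps point in correlated directions (so the step size should grow), while shorter paths indicate that steps cancel each other out (so it should shrink).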
