On Spectral Invariance of Randomized Hessian and Covariance Matrix Adaptation Schemes

We evaluate the performance of several gradient-free variable-metric continuous optimization schemes on a specific set of quadratic functions. We revisit a randomized Hessian approximation scheme (D. Leventhal and A. S. Lewis, Randomized Hessian estimation and directional search, 2011), discuss its theoretical underpinnings, and introduce a novel, numerically stable implementation of the scheme (RH). For comparison we also consider closely related Covariance Matrix Adaptation (CMA) schemes. A key goal of this study is to elucidate the influence of the eigenvalue distribution of quadratic functions on the convergence properties of the different variable-metric schemes. For this purpose we introduce a class of quadratic functions with parameterizable spectra. Our empirical study shows (i) that the performance of RH methods depends less on the spectral distribution than that of CMA schemes, (ii) that adaptive step size control is more efficient in the RH method than line search, and (iii) that the concept of the evolution path yields a substantial speed-up of CMA schemes on quadratic functions but does not alleviate their overall dependence on the eigenvalue spectrum. The present results may trigger research into the design of novel CMA update schemes with improved spectral invariance.
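To make the two ingredients named above concrete, the following minimal Python sketch builds a test quadratic f(x) = 0.5 x^T A x with a prescribed eigenvalue spectrum and performs rank-one randomized Hessian updates in the spirit of Leventhal and Lewis (2011), estimating the directional curvature by a central finite difference. The function names, the step size h, the number of updates, and the log-uniform spectrum are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def quadratic_with_spectrum(eigenvalues, rng):
    """Return f(x) = 0.5 * x^T A x where A has the prescribed eigenvalues."""
    n = len(eigenvalues)
    # Random orthogonal basis Q; the spectrum of A = Q diag(ev) Q^T is exactly `eigenvalues`.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A = Q @ np.diag(eigenvalues) @ Q.T
    return (lambda x: 0.5 * x @ A @ x), A

def randomized_hessian_update(f, x, H, rng, h=1e-2):
    """One rank-one update of the Hessian estimate H along a random unit direction u."""
    u = rng.standard_normal(len(x))
    u /= np.linalg.norm(u)
    # Central finite-difference estimate of u^T (nabla^2 f(x)) u;
    # for a quadratic this is exact for any step size h.
    curvature = (f(x + h * u) - 2.0 * f(x) + f(x - h * u)) / h**2
    # Correct H so that u^T H u matches the measured curvature.
    return H + (curvature - u @ H @ u) * np.outer(u, u)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 10
    spectrum = np.logspace(0, 3, n)          # condition number 1e3
    f, A = quadratic_with_spectrum(spectrum, rng)
    H, x = np.eye(n), rng.standard_normal(n)
    for _ in range(500):
        H = randomized_hessian_update(f, x, H, rng)
    print("relative error:", np.linalg.norm(H - A) / np.linalg.norm(A))
```

On a quadratic, each such rank-one correction can only decrease the Frobenius error between H and the true Hessian A, which is the property that underlies the convergence of this kind of estimation scheme.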

[1] K. Marti et al. Controlled Random Search Procedures for Global Optimization, 2020, International Series in Operations Research & Management Science.

[2] Christian L. Müller et al. Gaussian Adaptation Revisited - An Entropic View on Covariance Matrix Adaptation, 2010, EvoApplications.

[3] Nikolaus Hansen et al. Completely Derandomized Self-Adaptation in Evolution Strategies, 2001, Evolutionary Computation.

[4] K. Steiglitz et al. Adaptive step size random search, 1968.

[5] Robert Schaefer et al. Parallel Problem Solving from Nature - PPSN XI, 11th International Conference, Kraków, Poland, September 11-15, 2010, Proceedings, Part II, 2010, PPSN.

[6] Christian L. Müller et al. Gaussian Adaptation as a unifying framework for continuous black-box optimization and adaptive Monte Carlo sampling, 2010, IEEE Congress on Evolutionary Computation.

[7] O. Nelles et al. An Introduction to Optimization, 1996, IEEE Antennas and Propagation Magazine.

[8] Lars Taxén et al. Stochastic optimization in system design, 1981.

[9] Christian L. Müller et al. Optimization of Convex Functions with Random Pursuit, 2011, SIAM J. Optim.

[10] Petros Koumoutsakos et al. Reducing the Time Complexity of the Derandomized Evolution Strategy with Covariance Matrix Adaptation (CMA-ES), 2003, Evolutionary Computation.

[11] Ingo Rechenberg. Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution, 1973.

[12] Jens Jägersküpper et al. Rigorous Runtime Analysis of the (1+1) ES: 1/5-Rule and Ellipsoidal Fitness Landscapes, 2005, FOGA.

[13] Christophe Andrieu et al. A tutorial on adaptive MCMC, 2008, Stat. Comput.

[14] Anne Auger et al. Mirrored Sampling and Sequential Selection for Evolution Strategies, 2010, PPSN.

[15] Yurii Nesterov. Introductory Lectures on Convex Optimization - A Basic Course, 2014, Applied Optimization.

[16] W. Vent et al. Rechenberg, Ingo, Evolutionsstrategie - Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. 170 pp. with 36 figs. Frommann-Holzboog-Verlag, Stuttgart 1973. Paperback, 1975.

[17] D. Leventhal and A. S. Lewis. Randomized Hessian estimation and directional search, 2011.