Tuning as a means of assessing the benefits of new ideas in interplay with existing algorithmic modules

Introducing new algorithmic ideas is a key part of the continuous improvement of existing optimization algorithms. However, when a new component is introduced into an existing algorithm, assessing its potential benefits is a challenging task. Often, the component is added to a default implementation of the underlying algorithm and compared against a limited set of other variants. This assessment ignores any potential interplay with other algorithmic ideas that share the same base algorithm, which is critical for understanding the exact contribution being made. We explore a more extensive procedure, which uses hyperparameter tuning as a means of assessing the benefits of new algorithmic components. This allows for a more robust analysis, not only by focusing on the impact on performance, but also by investigating how that performance is achieved. We implement our approach in the context of the Modular CMA-ES framework, which was redesigned and extended to include several new modules and new options for existing modules, mostly focused on the step-size adaptation method. Our analysis highlights the differences between these new modules and identifies the situations in which they make the largest contribution.
