Leveraging benchmarking data for informed one-shot dynamic algorithm selection

A key challenge in applying evolutionary algorithms in practice is selecting an algorithm instance that best suits the problem at hand. This decision is further complicated by the fact that different algorithms may be best suited for different stages of the optimization process. Dynamic algorithm selection and configuration are therefore well-researched topics in evolutionary computation. Two settings are classically considered: hyper-heuristics and parameter control studies typically assume that the algorithm must be chosen and adjusted during the run, without prior information, whereas approaches such as hyper-parameter tuning and automated algorithm configuration assume that different configurations can be evaluated before a final recommendation is made. Practical applications of evolutionary algorithms often fall into a middle ground between these two settings: the algorithm instance must be decided before the run (the "one-shot" setting), but possibly large amounts of data are available on which an informed decision can be based. In this work we analyze how such prior performance data can be used to infer informed dynamic algorithm selection schemes for solving pseudo-Boolean optimization problems. Our specific use case considers a family of genetic algorithms.
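
To make the idea concrete, the sketch below (Python, purely illustrative and not the paper's implementation) shows one simple way to turn prior benchmarking data into a one-shot dynamic selection scheme. It assumes we have, for each candidate algorithm, estimated expected times to advance from one fitness level to the next (e.g., aggregated from benchmark logs); the schedule then fixes, before the run, which algorithm to use at each level by picking the per-level minimizer. The algorithm names and numbers are hypothetical.

```python
# A minimal sketch (not the authors' method) of deriving a one-shot
# dynamic algorithm selection schedule from prior benchmarking data.
# Assumption: for each algorithm we have estimated expected numbers of
# evaluations to improve from fitness level f to f+1.

from typing import Dict, List

def one_shot_schedule(level_costs: Dict[str, List[float]]) -> List[str]:
    """For every fitness level, pick the algorithm whose estimated expected
    time to reach the next level is smallest. The resulting per-level
    schedule is fixed before the run ("one-shot"), yet dynamic in the sense
    that the active algorithm changes with the attained fitness level.

    level_costs maps an algorithm name to a list whose entry f is the
    estimated expected number of evaluations needed to improve from level f.
    """
    n_levels = len(next(iter(level_costs.values())))
    schedule = []
    for f in range(n_levels):
        # Greedy per-level choice: minimize the estimated cost of this level.
        best = min(level_costs, key=lambda alg: level_costs[alg][f])
        schedule.append(best)
    return schedule

# Hypothetical data for two GA configurations on a 5-level problem:
data = {
    "GA(mutation=1/n)": [10.0, 12.0, 20.0, 45.0, 90.0],
    "GA(mutation=2/n)": [ 8.0, 11.0, 25.0, 60.0, 70.0],
}
print(one_shot_schedule(data))
# -> ['GA(mutation=2/n)', 'GA(mutation=2/n)', 'GA(mutation=1/n)',
#     'GA(mutation=1/n)', 'GA(mutation=2/n)']
```

If the expected per-level costs are independent of how earlier levels were reached, the total expected runtime is the sum of the per-level costs, and this greedy schedule minimizes it; in general it should be read only as a simple baseline for how prior performance data can inform a one-shot switching policy.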
