Per instance algorithm configuration of CMA-ES with limited budget

Per Instance Algorithm Configuration (PIAC) relies on features that describe problem instances. It builds an Empirical Performance Model (EPM) from a training set of (instance, parameter configuration) pairs, together with the corresponding performance of the algorithm at hand. This paper presents a case study in the continuous black-box optimization domain, using features proposed in the literature. The target algorithm is CMA-ES, with three of its hyper-parameters to be configured. Special care is taken with the computational cost of the features. The EPM is learned on the BBOB benchmark but tested on independent test functions gathered from the optimization literature. The results demonstrate that the proposed approach can outperform the default setting of CMA-ES with as few as 30 or 50 times the problem dimension additional function evaluations for feature computation.
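The sketch below illustrates the general PIAC pipeline described in the abstract, not the authors' exact implementation. Assumptions: `compute_features` is a hypothetical stand-in for an exploratory-landscape-analysis feature extractor restricted to a small sampling budget (e.g. 50 times the dimension), the three CMA-ES hyper-parameters are encoded as a plain configuration vector, and a random forest plays the role of the EPM.

```python
# Minimal PIAC sketch (illustrative only): train an EPM on
# (features, configuration) -> performance records, then pick a
# configuration for a new instance from cheaply computed features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def compute_features(problem, dim, budget_factor=50, rng=None):
    """Toy landscape features from budget_factor * dim evaluations.

    Hypothetical stand-in for real ELA features; only simple summary
    statistics of sampled objective values are returned here.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(-5.0, 5.0, size=(budget_factor * dim, dim))
    y = np.apply_along_axis(problem, 1, x)
    return np.array([y.mean(), y.std(), np.median(y), y.min(), y.max()])


def train_epm(training_records):
    """Fit the EPM from (feature_vector, config_vector, performance) triples."""
    X = np.array([np.concatenate([f, c]) for f, c, _ in training_records])
    y = np.array([p for _, _, p in training_records])
    epm = RandomForestRegressor(n_estimators=200, random_state=0)
    epm.fit(X, y)
    return epm


def pick_config(epm, features, candidate_configs):
    """Return the candidate configuration with the best predicted performance
    (assuming lower predicted values are better)."""
    X = np.array([np.concatenate([features, c]) for c in candidate_configs])
    predictions = epm.predict(X)
    return candidate_configs[int(np.argmin(predictions))]
```

In this reading of the approach, the only extra cost at solving time is the 30 to 50 times the problem dimension function evaluations consumed by the feature computation; the selected configuration is then passed to CMA-ES in place of its default hyper-parameter values.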
