An analysis of post-selection in automatic configuration

Automated algorithm configuration methods have proven instrumental in deriving high-performing algorithms, and they are increasingly used to configure evolutionary algorithms. One major challenge in devising automatic algorithm configuration techniques is handling the inherent stochasticity of configuration problems. This article analyses a post-selection mechanism that can be used for this task. The central idea of post-selection is to generate, in a first phase, a set of high-quality candidate algorithm configurations and then, in a second phase, to select the (statistically) best configuration from this candidate set. Our analysis indicates the high potential of this mechanism and suggests that it may help to improve automatic algorithm configuration methods.
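To make the two-phase structure concrete, the sketch below illustrates the idea in Python. The names `candidates`, `evaluate`, and `validation_instances` are placeholders, not part of the paper: they stand for a pool of configurations produced in the first phase (e.g., the elites returned by several tuner runs), a cost function that runs a configuration on an instance, and a shared set of benchmark instances. The second phase is simplified here to a paired comparison of mean costs; the actual mechanism analysed in the article relies on a statistically sound selection procedure rather than a plain mean.

```python
import statistics


def post_select(candidates, evaluate, validation_instances):
    """Minimal sketch of the second (selection) phase of post-selection.

    Assumptions (hypothetical interface, not the paper's implementation):
      - `candidates` is the pool of configurations from the first phase,
      - `evaluate(config, instance)` returns a cost to be minimised,
      - all candidates are evaluated on the same validation instances,
        so the comparison is blocked on instances.
    """
    # Evaluate every candidate on the common validation set.
    mean_cost = [
        statistics.mean(evaluate(c, inst) for inst in validation_instances)
        for c in candidates
    ]
    # Return the configuration with the lowest mean cost; a statistical
    # test (e.g. a racing procedure) would replace this step in practice.
    best_index = min(range(len(candidates)), key=mean_cost.__getitem__)
    return candidates[best_index]
```

A realistic use would obtain `candidates` from several independent runs of a configurator and reserve `validation_instances` that were not seen during the first phase, so that the selection is not biased by the tuning process itself.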
