Learning Feature-Parameter Mappings for Parameter Tuning via the Profile Expected Improvement

The majority of algorithms can be controlled or adjusted by parameters, whose values can substantially affect the algorithms' performance. Since manual exploration of the parameter space is tedious -- even for a few parameters -- several automatic procedures for parameter tuning have been proposed. Recent approaches also take into account characteristic properties of the problem instances, frequently termed instance features. Our contribution is a novel concept for feature-based algorithm parameter tuning, which applies an approximating surrogate model to learn the continuous feature-parameter mapping. To accomplish this, we learn a joint model of the algorithm performance based on both the algorithm parameters and the instance features. The required data is gathered using a recently proposed acquisition function for model refinement in surrogate-based optimization: the profile expected improvement. This function provides an avenue for maximizing the information required for the feature-parameter mapping, i.e., the mapping from instance features to the corresponding optimal algorithm parameters. The approach is validated by applying the tuner to exemplary evolutionary algorithms and problems, for which theoretically grounded or heuristically determined feature-parameter mappings are available.
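
The following is a minimal sketch (not the authors' implementation) of the core idea: fit a joint surrogate of performance over instance features and algorithm parameters, then read off the feature-parameter mapping as the per-feature ("profile") optimizer of the surrogate. It assumes NumPy and scikit-learn's GaussianProcessRegressor, uses a made-up toy performance function, and replaces the profile expected improvement refinement loop with a simple one-shot random design.

```python
# Sketch: joint surrogate over (instance feature, algorithm parameter) -> performance,
# and extraction of the feature-parameter mapping as the profile optimum.
# Assumptions: scikit-learn GP surrogate, toy performance function, 1-D feature and
# parameter; the actual method refines the design via the profile expected improvement.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Toy ground truth (hypothetical): the optimal parameter depends linearly on the
# feature, and performance degrades quadratically with the distance to that optimum.
def performance(f, p):
    return (p - (0.2 + 0.6 * f)) ** 2 + 0.01 * rng.normal()

# Initial design: random (feature, parameter) pairs with observed performance.
X = rng.uniform(0.0, 1.0, size=(40, 2))   # columns: feature f, parameter p
y = np.array([performance(f, p) for f, p in X])

# Joint surrogate model of performance over features and parameters.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

# Feature-parameter mapping: for each feature value, the parameter minimizing the
# surrogate's predicted performance (the profile optimizer over the parameter axis).
features = np.linspace(0.0, 1.0, 21)
params = np.linspace(0.0, 1.0, 201)
mapping = []
for f in features:
    cand = np.column_stack([np.full_like(params, f), params])
    mu = gp.predict(cand)
    mapping.append((f, params[np.argmin(mu)]))

for f, p_opt in mapping[::5]:
    print(f"feature {f:.2f} -> suggested parameter {p_opt:.2f}")
```

In the paper's setting, the design points would not be chosen at random: the profile expected improvement directs new evaluations toward (feature, parameter) regions that are most informative about the per-feature optima, rather than about the global optimum alone.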
