Sequential Model-Based Parameter Optimisation: an Experimental Investigation of Automated and Interactive Approaches

This work experimentally investigates model-based approaches for optimizing the performance of parameterized randomized algorithms. Such approaches build a response surface model and use it to find good parameter settings of the given algorithm. We evaluated two methods from the literature that are based on Gaussian process models: sequential parameter optimization (SPO) (Bartz-Beielstein et al., 2005) and sequential Kriging optimization (SKO) (Huang et al., 2006). SPO performed better “out-of-the-box,” whereas SKO was competitive when response values were log-transformed. We then investigated key design decisions within the SPO paradigm, characterizing the performance consequences of each. Based on these findings, we propose a new version of SPO, dubbed SPO+, which extends SPO with a novel intensification procedure and a log-transformed objective function. In a domain for which performance results for other (model-free) parameter optimization approaches are available, we demonstrate that SPO+ achieves state-of-the-art performance. Finally, we compare this automated parameter tuning approach to an interactive, manual process that makes use of classical regression techniques. This interactive approach is particularly useful when only a relatively small number of parameter configurations can be evaluated. Because it can relatively quickly draw attention to important parameters and parameter interactions, it can help experts gain insights into the parameter response of a given algorithm and identify reasonable parameter settings.
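To make the sequential model-based optimization (SMBO) loop underlying SPO and SKO concrete, the sketch below fits a Gaussian process to observed (configuration, performance) pairs, selects the next configuration by expected improvement, evaluates it, and repeats, with responses log-transformed as discussed in the abstract. This is a minimal illustrative outline, not the authors' SPO+ procedure: the objective `run_algorithm`, the one-dimensional parameter space, the candidate-sampling scheme, and the use of scikit-learn's GaussianProcessRegressor are all assumptions made for the example.

```python
# Minimal SMBO sketch in the spirit of SPO/SKO (illustrative only, not SPO+ itself).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def run_algorithm(x):
    # Hypothetical stand-in for one run of a randomized algorithm with parameter x;
    # the response (e.g. runtime) is noisy and kept positive so a log transform is sensible.
    return (x - 0.3) ** 2 + 0.05 + 0.01 * rng.standard_normal()

def expected_improvement(mu, sigma, best):
    # Expected improvement for minimization under a Gaussian predictive distribution.
    sigma = np.maximum(sigma, 1e-12)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial design: a few random configurations, responses log-transformed.
X = rng.uniform(0, 1, size=(5, 1))
y = np.log10([run_algorithm(x[0]) for x in X])

for _ in range(20):
    # Fit the response surface model to all data gathered so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    # Score a set of random candidate configurations and pick the most promising one.
    cand = rng.uniform(0, 1, size=(1000, 1))
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
    # Evaluate the chosen configuration and add the result to the design.
    y_next = np.log10(run_algorithm(x_next[0]))
    X, y = np.vstack([X, x_next]), np.append(y, y_next)

print("best configuration found:", X[np.argmin(y)].round(3))
```

SPO+ additionally controls how many runs are spent per configuration (intensification); the sketch above sidesteps this by performing a single run per selected configuration.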

[1] Margaret J. Robertson, et al. Design and Analysis of Experiments, 2006, Handbook of Statistics.

[2] G. Box, et al. Some New Three Level Designs for the Study of Quantitative Variables, 1960.

[3] Robert E. Shannon, et al. Design and analysis of simulation experiments, 1978, WSC '78.

[4] J. S. Hunter, et al. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, 1979.

[5] Derek J. Pike, et al. Empirical Model-Building and Response Surfaces, 1988.

[6] G. Box, et al. Empirical Model-Building and Response Surfaces, 1990.

[7] F. Pukelsheim. Optimal Design of Experiments, 1993.

[8] Hans-Paul Schwefel, et al. Evolution and Optimum Seeking, 1995, Sixth-Generation Computer Technology Series.

[9] Ross Ihaka and Robert Gentleman. R: A language for data analysis and graphics, 1996.

[10] Nikolaus Hansen, et al. Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation, 1996, Proceedings of IEEE International Conference on Evolutionary Computation.

[11] Donald R. Jones, et al. Global versus local search in constrained optimization of computer models, 1998.

[12] Donald R. Jones, et al. Efficient Global Optimization of Expensive Black-Box Functions, 1998, J. Glob. Optim.

[13] J. R. Koehler, et al. Modern Applied Statistics with S-Plus, 1996.

[14] Thomas J. Santner, et al. Sequential design of computer experiments to minimize integrated response functions, 2000.

[15] George C. Runger, et al. Using Experimental Design to Find Effective Parameter Settings for Heuristics, 2001, J. Heuristics.

[16] Hans-Georg Beyer, et al. The Theory of Evolution Strategies, 2001, Natural Computing Series.

[17] P. A. Newman, et al. Approximation and Model Management in Aerodynamic Optimization with Variable-Fidelity Models, 2001.

[18] Holger H. Hoos, et al. Scaling and Probabilistic Smoothing: Efficient Dynamic Local Search for SAT, 2002, CP.

[19] Peter Dalgaard, et al. Introductory statistics with R, 2002, Statistics and Computing.

[20] Yoav Shoham, et al. Learning the Empirical Hardness of Optimization Problems: The Case of Combinatorial Auctions, 2002, CP.

[21] Thomas Stützle, et al. A Racing Algorithm for Configuring Metaheuristics, 2002, GECCO.

[22] Ramana V. Grandhi, et al. Improved Distributed Hypercube Sampling, 2002.

[23] Cass T. Miller, et al. Optimal design for problems involving flow and transport phenomena in saturated subsurface systems, 2002.

[24] Søren Nymand Lophaven, et al. Aspects of the Matlab toolbox DACE, 2002.

[25] Y. Shoham, et al. Resource allocation in competitive multiagent systems, 2003.

[26] Thomas Bartz-Beielstein, et al. Experimental Analysis of Evolution Strategies - Overview and Comprehensive Introduction, 2003.

[27] Chun-Hung Chen, et al. Optimal Computing Budget Allocation of Indifference-zone-selection Procedures, 2003.

[28] Holger H. Hoos, et al. UBCSAT: An Implementation and Experimentation Environment for SLS Algorithms for SAT & MAX-SAT, 2004, SAT.

[29] Charles Audet, et al. Finding Optimal Algorithmic Parameters Using the Mesh Adaptive Direct Search Algorithm, 2004.

[30] Thomas Stützle, et al. Stochastic Local Search: Foundations & Applications, 2004.

[31] Thomas Bartz-Beielstein, et al. Analysis of Particle Swarm Optimization Using Computational Statistics, 2004.

[32] Thomas Bartz-Beielstein, et al. Tuning search algorithms for real-world applications: a regression tree based approach, 2004, Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753).

[33] Thomas Bartz-Beielstein, et al. Design and Analysis of Optimization Algorithms Using Computational Statistics, 2004.

[34] Thomas Bartz-Beielstein, et al. Designing particle swarm optimization with regression trees, 2004.

[35] Nikolaus Hansen, et al. Evaluating the CMA Evolution Strategy on Multimodal Test Functions, 2004, PPSN.

[36] Thomas Bartz-Beielstein, et al. Sequential parameter optimization, 2005, 2005 IEEE Congress on Evolutionary Computation.

[37] N. Zheng, et al. Global Optimization of Stochastic Black-Box Systems via Sequential Kriging Meta-Models, 2006, J. Glob. Optim.

[38] Thomas Bartz-Beielstein, et al. Experimental Research in Evolutionary Computation - The New Experimentalism, 2010, Natural Computing Series.

[39] Thomas Bartz-Beielstein, et al. Considerations of Budget Allocation for Sequential Parameter Optimization (SPO), 2006.

[40] Kevin Leyton-Brown, et al. Performance Prediction and Automated Tuning of Randomized and Parametric Algorithms, 2006, CP.

[41] Nikolaus Hansen, et al. The CMA Evolution Strategy: A Comparing Review, 2006, Towards a New Evolutionary Computation.

[42] Manuel Laguna, et al. Fine-Tuning of Algorithms Using Fractional Experimental Designs and Local Search, 2006, Oper. Res.

[43] Thomas Stützle, et al. Improvement Strategies for the F-Race Algorithm: Sampling Design and Iterative Refinement, 2007, Hybrid Metaheuristics.

[44] J. Weston, et al. Approximation Methods for Gaussian Process Regression, 2007.

[45] Thomas Stützle, et al. Automatic Algorithm Configuration Based on Local Search, 2007, AAAI.

[46] F. Hutter, et al. ParamILS: An Automatic Algorithm Configuration Framework, 2009, J. Artif. Intell. Res.

[47] Thomas Bartz-Beielstein, et al. Optimized Modelling of Fill Levels in Stormwater Tanks Using CI-based Parameter Selection Schemes, 2009, Autom.

[48] Kevin P. Murphy, et al. An experimental investigation of model-based parameter optimisation: SPO and beyond, 2009, GECCO.

[49] Sonja Kuhnt, et al. Design and analysis of computer experiments, 2010.