Squirrel: A Switching Hyperparameter Optimizer
Description of the entry by AutoML.org & IOHprofiler to the NeurIPS 2020 BBO challenge (16 Dec 2020)

In this short note, we describe our submission to the NeurIPS 2020 BBO challenge. Motivated by the fact that different optimizers work well on different problems, our approach switches between different optimizers. Since the team names on the competition's leaderboard were randomly generated "alliteration nicknames", consisting of an adjective and an animal with the same initial letter, we called our approach the Switching Squirrel, or, for short, Squirrel. Our reference implementation of Squirrel is available at https://github.com/automl/Squirrel-Optimizer-BBO-NeurIPS20-automlorg.
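The switching idea can be illustrated with a toy sketch. Note that this note does not specify Squirrel's actual component optimizers or switching schedule, so everything below (the two proposal strategies, the budget split, and all function names) is purely illustrative: a global random-search phase hands over to a local-perturbation phase once part of the evaluation budget is spent.

```python
import random


def global_proposal(bounds, rng):
    # Illustrative "optimizer 1": uniform random search over the box.
    return [rng.uniform(lo, hi) for lo, hi in bounds]


def local_proposal(bounds, rng, incumbent):
    # Illustrative "optimizer 2": Gaussian perturbation around the
    # best point found so far, clipped back into the box.
    return [min(hi, max(lo, x + rng.gauss(0.0, 0.1 * (hi - lo))))
            for x, (lo, hi) in zip(incumbent, bounds)]


def switching_optimize(f, bounds, budget, switch_at, seed=0):
    """Minimize f over a box by switching from a global to a local
    proposal strategy after `switch_at` evaluations (toy example,
    not Squirrel's actual schedule)."""
    rng = random.Random(seed)
    best_x = [rng.uniform(lo, hi) for lo, hi in bounds]
    best_y = f(best_x)
    for t in range(1, budget):
        if t < switch_at:
            x = global_proposal(bounds, rng)
        else:
            x = local_proposal(bounds, rng, best_x)
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y
```

For example, `switching_optimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2, budget=200, switch_at=100)` spends half the budget exploring globally and half refining locally. Squirrel's design rationale is the same at a high level, choosing the optimizer that suits the current phase of the search, though its actual components and switching criteria are described in the full paper.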
