The Surrogate Model

This chapter presents the first key component of Bayesian optimization (BO): the probabilistic surrogate model. Section 3.1 focuses on Gaussian processes (GPs); Sect. 3.2 introduces Thompson sampling, a sequential optimization method also based on GPs; finally, Sect. 3.3 presents other probabilistic models that can, in some cases, be a suitable alternative to GPs.
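
To make the two main ingredients concrete before the detailed treatment in Sects. 3.1 and 3.2, the sketch below shows a minimal GP surrogate (zero mean, RBF kernel) and a single Thompson-sampling step on a toy one-dimensional objective. It is an illustrative sketch only, using plain NumPy; the objective function, kernel length scale, noise level, and discretized search grid are assumptions made for the example and are not taken from the chapter.

```python
import numpy as np

# Minimal GP surrogate with an RBF kernel (length_scale and noise are
# illustrative choices, not values prescribed by the chapter).
def rbf_kernel(A, B, length_scale=0.2):
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-4):
    """Posterior mean and covariance of a zero-mean GP at the test points."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, cov

# Toy objective, assumed here only for illustration.
f = lambda x: np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=5)      # initial design
y_train = f(X_train)
X_grid = np.linspace(0, 2, 200)          # discretized search space

# One Thompson-sampling step: draw one function from the GP posterior
# and propose the point that minimizes the sampled function.
mean, cov = gp_posterior(X_train, y_train, X_grid)
sample = rng.multivariate_normal(mean, cov + 1e-8 * np.eye(len(X_grid)))
x_next = X_grid[np.argmin(sample)]
print(f"next evaluation point: {x_next:.3f}")
```

In a full BO loop this proposal would be evaluated on the true objective, appended to the training data, and the procedure repeated; the sampled posterior function naturally balances exploration and exploitation, which is the property Sect. 3.2 develops.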
