Efficient Bayesian Optimization Based on Parallel Sequential Random Embeddings

Bayesian optimization, which offers efficient parameter search, suffers from high computation cost when the parameters are high-dimensional, because the search space expands and more trials are needed. One existing solution is an embedding method that restricts the search to a low-dimensional subspace, but this method works well only when the number of embedding dimensions closely matches the number of effective dimensions, i.e., the dimensions that actually affect the function value. In practical situations, however, the number of effective dimensions is unknown, and embedding into a low-dimensional subspace to save computation cost often results in searching a subspace of lower dimensionality than the effective dimensions. This study proposes a Bayesian optimization method based on random embedding that remains efficient even when the embedding dimension is lower than the number of effective dimensions. By conducting parallel searches in an initially low-dimensional space and performing multiple cycles in which the search space is incrementally improved, the method can efficiently find the optimum solution. An experiment on benchmark problems shows the effectiveness of the proposed method.
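
To make the idea concrete, the following is a minimal sketch, not the authors' implementation, of Bayesian optimization with sequential random embeddings: each cycle draws a few random embedding matrices, runs a low-dimensional GP-based search in each (searches that could be executed in parallel), and re-centers the next cycle on the best point found so far. The toy objective, the dimensionalities, the use of scikit-learn's Gaussian process regressor, and the random-candidate acquisition maximizer are all illustrative assumptions.

```python
# Illustrative sketch only -- not the paper's implementation. The objective,
# dimensionalities, GP library (scikit-learn), and acquisition optimizer
# (random candidate search) are assumptions made for brevity.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

D, d_embed = 100, 4                    # ambient and embedding dimensions (assumed)
EFFECTIVE = [3, 17, 42, 58, 71, 90]    # hidden effective dimensions of the toy objective


def objective(x):
    """Toy high-dimensional function whose value depends on only a few coordinates."""
    return float(np.sum((x[EFFECTIVE] - 0.5) ** 2))   # minimum 0 at x[EFFECTIVE] = 0.5


def expected_improvement(mu, sigma, best):
    """Expected improvement acquisition for minimization."""
    sigma = np.maximum(sigma, 1e-9)
    gamma = (best - mu) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))


def bo_in_embedding(f, x_center, n_init=5, n_iter=25, n_cand=2000):
    """GP-based search over y in [-1, 1]^d_embed with x = clip(x_center + A @ y, 0, 1)."""
    A = rng.normal(size=(D, d_embed)) / np.sqrt(d_embed)     # random embedding matrix
    to_x = lambda y: np.clip(x_center + A @ y, 0.0, 1.0)
    Y = rng.uniform(-1, 1, size=(n_init, d_embed))
    vals = np.array([f(to_x(y)) for y in Y])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(Y, vals)
        cand = rng.uniform(-1, 1, size=(n_cand, d_embed))
        mu, sigma = gp.predict(cand, return_std=True)
        y_next = cand[np.argmax(expected_improvement(mu, sigma, vals.min()))]
        Y = np.vstack([Y, y_next])
        vals = np.append(vals, f(to_x(y_next)))
    best = int(np.argmin(vals))
    return to_x(Y[best]), vals[best]


# Each cycle draws several independent embeddings (searched in parallel in the
# paper, sequentially here), keeps the best result, and re-centers on it.
x_best, f_best = rng.uniform(0, 1, D), np.inf
for cycle in range(3):
    results = [bo_in_embedding(objective, x_best) for _ in range(4)]
    x_cand, f_cand = min(results, key=lambda r: r[1])
    if f_cand < f_best:
        x_best, f_best = x_cand, f_cand
    print(f"cycle {cycle}: best objective value so far = {f_best:.4f}")
```

Even when the embedding dimension is smaller than the number of effective dimensions, re-drawing the embedding and re-centering on the incumbent each cycle lets later cycles recover progress that a single fixed embedding would miss, which is the intuition behind the proposed approach.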
