Cooling Strategies for the Moment-Generating Function in Bayesian Global Optimization

The Bayesian Global Optimization algorithm is designed to optimize expensive objective functions with a small evaluation budget. It employs a surrogate model and assesses the potential improvement of unseen solutions through a so-called infill criterion. A novel infill criterion proposed in our previous work is derived from the moment-generating function of the improvement. In contrast to other techniques, it features a continuous parameter that can be used to adjust the exploration-exploitation tradeoff smoothly. In this work, two cooling strategies (linear and exponential) are adopted to enhance the explorative behavior in the early stage of the search and the exploitative effect in the final converging stage. Moreover, the initial temperature and cooling speed are investigated on selected multi-modal functions, showing that a good setting of these two parameters depends on the specifics of the problem. The proposed Bayesian optimization with a cooling strategy is tested on the well-known BBOB benchmark. The results show that, without tuning the initial temperature and cooling speed, the proposed approach improves performance on a range of multi-modal functions compared to the commonly used expected improvement criterion.
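The two cooling strategies named in the abstract can be sketched as simple temperature schedules. This is a minimal illustration, not the authors' implementation: the function names, the end temperature `t_end`, and the assumption that the temperature is decayed per iteration from an initial value `t0` are all hypothetical choices for the sketch.

```python
def linear_cooling(t0: float, t_end: float, step: int, n_steps: int) -> float:
    """Linear schedule: interpolate the temperature from t0 down to t_end."""
    return t0 + (t_end - t0) * step / (n_steps - 1)


def exponential_cooling(t0: float, t_end: float, step: int, n_steps: int) -> float:
    """Exponential schedule: multiply by a constant decay factor each iteration,
    chosen so the temperature reaches t_end at the final step."""
    alpha = (t_end / t0) ** (1.0 / (n_steps - 1))
    return t0 * alpha ** step
```

In a Bayesian optimization loop, the scheduled temperature would be fed into the moment-generating-function infill criterion at each iteration: a high early temperature emphasizes exploration, while the decayed late temperature emphasizes exploitation.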
