Global optimization of expensive-to-evaluate functions: an empirical comparison of two sampling criteria

In many global optimization problems motivated by engineering applications, the number of function evaluations is severely limited by time or cost. To ensure that each evaluation contributes usefully to locating good candidates for the global minimizer, a stochastic model of the function can be built to guide the sequential choice of evaluation points. Building on Gaussian processes and Kriging, the authors recently introduced the informational approach to global optimization (IAGO), which provides a one-step optimal choice of evaluation points in terms of the reduction of uncertainty on the location of the minimizers. To do so, the probability density of the minimizers is approximated using conditional simulations of the Gaussian process model underlying Kriging. This paper presents an empirical comparison between the underlying sampling criterion, called conditional minimizer entropy (CME), and the standard expected improvement (EI) sampling criterion. Classical test functions are used, along with sample paths of the Gaussian model and an industrial application. The results demonstrate the advantage of the CME criterion in terms of evaluation savings.
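The expected improvement criterion compared in the abstract has a well-known closed form when the surrogate's posterior at a candidate point is Gaussian. The sketch below is illustrative only (function and variable names are not from the paper), assuming a posterior mean `mu`, posterior standard deviation `sigma`, and best observed value `f_min`:

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Closed-form EI at a point whose posterior is N(mu, sigma^2),
    given the best value f_min observed so far (minimization).
    Larger EI means the point is more attractive to evaluate next."""
    if sigma <= 0.0:
        # Degenerate posterior (e.g. at an already-evaluated point):
        # no expected improvement.
        return 0.0
    z = (f_min - mu) / sigma
    # Standard normal pdf and cdf, written with the math module only.
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_min - mu) * cdf + sigma * pdf
```

The two terms trade off exploitation (a mean well below `f_min`) against exploration (a large posterior standard deviation); EI is maximized over the search domain to pick the next evaluation point.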
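The CME criterion, by contrast, works with the distribution of the minimizer's location, approximated from conditional simulations of the Gaussian process. A minimal sketch of the core quantity, the empirical entropy of the minimizer over a discrete evaluation grid, assuming each simulated sample path is given as a list of values on a common grid (names are illustrative, not the paper's implementation):

```python
import math

def minimizer_entropy(sample_paths):
    """Empirical entropy of the minimizer location, estimated from
    conditional simulations of the GP model. Each element of
    sample_paths is a list of simulated function values on a shared
    grid; the argmin of each path is one sample of the minimizer."""
    counts = {}
    for path in sample_paths:
        i_min = min(range(len(path)), key=path.__getitem__)
        counts[i_min] = counts.get(i_min, 0) + 1
    n = len(sample_paths)
    # Shannon entropy of the empirical argmin distribution (nats).
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

In the IAGO framework, the next evaluation point is chosen to minimize the expected value of this entropy conditional on the candidate evaluation, i.e. the point whose outcome is expected to reduce uncertainty on the minimizer's location the most.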
