Sequential design for response surface model fit in computer experiments using derivative information

ABSTRACT This article considers the problem of response surface model fit in computer experiments. We propose a new sequential adaptive design based on a “maximum expected improvement” approach. The new method defines the improvement via a first-order approximation from the known design points using derivative information, and sequentially seeks points in regions with large curvature and large prediction variance. A version with a distance penalty is also considered. We demonstrate the superiority of these methods over several existing methods by simulation.
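To make the criterion concrete, the following is a minimal sketch of how such a derivative-based improvement measure could be computed, assuming a Gaussian process surrogate and a finite candidate set. It is not the authors' implementation: the gradient is taken from the surrogate by finite differences, the improvement at a candidate is the gap between the surrogate prediction and a first-order extrapolation from the nearest design point plus the predictive standard deviation, and the exponential distance penalty is an illustrative choice. The names `surrogate_gradient`, `expected_improvement_fo`, and the `penalty` parameter are hypothetical.

```python
# Sketch of a first-order "expected improvement" criterion for sequential
# design; the improvement definition and penalty form are assumptions made
# for illustration, not the paper's exact formulas.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel


def surrogate_gradient(gp, x, h=1e-4):
    """Central finite-difference gradient of the GP mean at a point x."""
    d = x.size
    grad = np.empty(d)
    for j in range(d):
        e = np.zeros(d)
        e[j] = h
        f_plus = gp.predict((x + e).reshape(1, -1))[0]
        f_minus = gp.predict((x - e).reshape(1, -1))[0]
        grad[j] = (f_plus - f_minus) / (2 * h)
    return grad


def expected_improvement_fo(gp, X_design, candidates, penalty=0.0):
    """First-order improvement at each candidate point.

    Large values flag candidates where the surrogate deviates strongly from
    the linear extrapolation from the nearest design point (curvature) and
    where the predictive uncertainty is high (variance).
    """
    mu, sd = gp.predict(candidates, return_std=True)
    dists = cdist(candidates, X_design)
    nearest = dists.argmin(axis=1)

    imp = np.empty(len(candidates))
    for i, x in enumerate(candidates):
        x0 = X_design[nearest[i]]
        f0 = gp.predict(x0.reshape(1, -1))[0]
        g0 = surrogate_gradient(gp, x0)
        linear = f0 + g0 @ (x - x0)            # first-order approximation
        imp[i] = abs(mu[i] - linear) + sd[i]   # curvature term + variance term

    if penalty > 0:
        # Distance-penalised version: damp candidates that sit very close
        # to an existing design point.
        imp *= np.exp(-penalty / dists.min(axis=1).clip(1e-12))
    return imp


# Toy usage on a 1-D test function.
def f(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2


rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, size=(6, 1))
y = f(X)
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True).fit(X, y)

cand = np.linspace(0.0, 3.0, 200).reshape(-1, 1)
x_next = cand[np.argmax(expected_improvement_fo(gp, X, cand, penalty=0.05))]
```

In an actual sequential design loop, `x_next` would be evaluated on the computer model, appended to the design, and the surrogate refit before selecting the next point.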
