Using sparse kernels to design computer experiments with tunable precision

Statistical methods for the design of computer experiments usually rely on a Gaussian process (GP) surrogate model, and typically aim at selecting design points (combinations of algorithmic and model parameters) that minimize the average prediction variance, or maximize the estimation accuracy of the hyperparameters of the GP surrogate. In many applications, experiments have a tunable precision, in the sense that a single software parameter controls the tradeoff between accuracy and computing time (e.g., the mesh size in FEM simulations or the number of Monte Carlo samples). We formulate the problem of allocating a budget of computing time over a finite set of candidate points for the goals mentioned above. This is a continuous optimization problem, which is moreover convex whenever the accuracy-vs-computing-time tradeoff function is concave. On the other hand, using non-concave weight functions can help to identify sparse designs. In addition, using sparse kernel approximations drastically reduces the cost per iteration of the multiplicative weights updates that can be used to solve this problem.
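
To make the allocation problem concrete, here is a minimal sketch (not the paper's actual algorithm) of a multiplicative-weights / exponentiated-gradient update for minimizing the average prediction variance over a budget simplex. The squared-exponential kernel, the concave tradeoff function g(t) = sqrt(t), and all parameter names (T, sigma2, eta) are illustrative assumptions; observing point i for time t_i is modeled as a noisy observation with variance sigma2 / g(t_i).

```python
import numpy as np

def rbf_kernel(X, lengthscale=0.3):
    """Squared-exponential kernel matrix on a 1-D candidate set (assumed model)."""
    d2 = (X[:, None] - X[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def avg_pred_variance_and_grad(t, K_inv, sigma2, g, g_prime):
    """Average posterior variance Phi(t) and its gradient.

    With per-point noise variance sigma2 / g(t_i), the posterior covariance is
        C(t) = (K^{-1} + diag(g(t)) / sigma2)^{-1},
    and since dC/dt_i = -C (g'(t_i)/sigma2) E_ii C with C symmetric,
        dPhi/dt_i = -(g'(t_i)/sigma2) * ||C[i, :]||^2 / n.
    """
    n = len(t)
    C = np.linalg.inv(K_inv + np.diag(g(t)) / sigma2)
    grad = -(g_prime(t) / sigma2) * np.sum(C**2, axis=1) / n
    return np.trace(C) / n, grad

def allocate_budget(X, T=10.0, sigma2=0.1, n_iter=200, eta=5.0,
                    g=lambda t: np.sqrt(t),   # concave tradeoff -> convex problem
                    g_prime=lambda t: 0.5 / np.sqrt(np.maximum(t, 1e-12))):
    """Exponentiated-gradient (multiplicative weights) descent on the simplex
    {t >= 0, sum(t) = T} for the average-prediction-variance criterion."""
    n = len(X)
    K = rbf_kernel(X) + 1e-8 * np.eye(n)   # jitter for numerical invertibility
    K_inv = np.linalg.inv(K)
    t = np.full(n, T / n)                  # start from a uniform allocation
    for _ in range(n_iter):
        phi, grad = avg_pred_variance_and_grad(t, K_inv, sigma2, g, g_prime)
        t = t * np.exp(-eta * grad)        # multiplicative update
        t = T * t / t.sum()                # renormalize onto the budget simplex
    return t, phi

if __name__ == "__main__":
    X = np.linspace(0.0, 1.0, 30)          # 1-D candidate design points
    T = 10.0                               # total computing-time budget
    t, phi = allocate_budget(X, T=T)
    print(f"average prediction variance: {phi:.4f}")
    print(f"points receiving > 1% of the budget: {(t > 0.01 * T).sum()}")
```

With a concave g the objective is convex over the simplex, so this multiplicative scheme descends toward the optimal allocation; a non-concave g tends to concentrate the budget on few points, consistent with the sparsity claim above. Each iteration of the sketch is dominated by an n x n inverse; replacing K by a low-rank (e.g., Nyström-type) sparse kernel approximation is what reduces this per-iteration cost.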
