Deep Gaussian Process metamodeling of sequentially sampled non-stationary response surfaces

Simulations are often used in the design of complex systems because they allow the design space to be explored without building several prototypes. Over the years, both simulation accuracy and the associated computational cost have increased significantly, limiting the overall number of simulations that can be afforded during the design process. Metamodeling therefore aims to approximate the simulation response with a cheap-to-evaluate mathematical surrogate, learned from a limited set of simulator evaluations. Kernel-based methods using stationary kernels are nowadays widely used. However, applying stationary kernels to non-stationary responses can be inappropriate and result in poor models, particularly when combined with sequential design. We present the application of a novel kernel-based technique, known as Deep Gaussian Processes, which is better able to cope with these difficulties. We evaluate the method for non-stationary regression on a series of real-world problems, showing that it outperforms standard Gaussian Processes with stationary kernels.
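To make the problem setting concrete, the sketch below fits a standard Gaussian Process with a stationary squared-exponential kernel to a handful of evaluations of a toy non-stationary "simulator" and then selects the next sample point by maximum posterior variance, a simple form of sequential design. This is an illustration only, not the authors' method or code: the toy response, the fixed hyperparameter values, and the variance-based selection criterion are assumptions chosen to show why a single stationary lengthscale struggles on non-stationary responses.

```python
# Minimal sketch (illustrative assumptions, not the paper's implementation):
# GP regression with a stationary squared-exponential kernel on a toy
# non-stationary response, followed by one naive sequential-design step.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.15, variance=1.0):
    """Stationary squared-exponential kernel k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))."""
    sqdist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-4):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)
    return mean, var

# Toy non-stationary response: nearly flat on one side, rapidly oscillating on the other.
simulator = lambda x: np.where(x < 0.5, 0.1 * x, np.sin(30 * x))

X_train = np.linspace(0, 1, 12)[:, None]      # limited simulator budget
y_train = simulator(X_train).ravel()
X_test = np.linspace(0, 1, 200)[:, None]
mean, var = gp_predict(X_train, y_train, X_test)

# One step of a naive variance-based sequential design: place the next
# simulator run where the surrogate is most uncertain.
x_next = X_test[np.argmax(var)]
```

With a single global lengthscale, the model must trade off the smooth and oscillatory regions, and the variance-driven sequential step tends to keep sampling where the stationary assumption fits worst; this is the behaviour the paper addresses by replacing the standard GP with a Deep Gaussian Process.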
