Dimension reduction for emulation: application to the influence of bathymetry on tsunami heights

High-accuracy complex computer models, also called simulators, require large resources in time and memory to produce realistic results. Statistical emulators are computationally cheap approximations of such simulators. They can be built to replace simulators for various purposes, such as propagating uncertainties from inputs to outputs or calibrating internal parameters against observations. However, when the input space is high-dimensional, the construction of an emulator can become prohibitively expensive. In this paper, we introduce a joint framework that merges emulation with dimension reduction to overcome this hurdle. We choose the gradient-based kernel dimension reduction technique for its ability to extract a drastically lower-dimensional input space with little loss of information, and we combine it with Gaussian process emulation. We explore the theoretical properties of the approximation and demonstrate its efficiency and accuracy both theoretically and on an elliptic PDE. Finally, we present a realistic application to tsunami modeling. The uncertainties in the sea-floor elevation (bathymetry) are modeled as high-dimensional realizations of a spatial process using the INLA-SPDE approach. Our dimension-reduced emulation enables us to compute the impact of these uncertainties on the resulting tsunami wave heights near shore and on shore. Considering an uncertain earthquake source, we observe a significant increase in the spread of the tsunami-height uncertainties due to the contribution of the bathymetry uncertainties to the overall uncertainty budget. These results highlight the need to reduce bathymetry uncertainties for early warning and hazard assessment.
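To illustrate the general idea (this is a minimal sketch, not the authors' implementation): gradient-based kernel dimension reduction (gKDR) estimates a projection matrix B from Gram matrices of inputs and outputs, the inputs are projected onto the reduced space, and a Gaussian process emulator is fitted on the projected inputs. The function name gkdr_projection, the Gaussian bandwidths, the regularization eps and the reduced dimension m are illustrative assumptions, and scikit-learn's GaussianProcessRegressor stands in for whatever emulator the study actually used.

```python
# Sketch: gKDR projection followed by a GP emulator on the reduced inputs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def gkdr_projection(X, Y, m, sigma_x, sigma_y, eps=1e-3):
    """Estimate a d x m projection matrix B via gKDR (illustrative settings)."""
    n, d = X.shape
    # Gaussian Gram matrices for inputs and outputs
    Kx = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2) / (2 * sigma_x ** 2))
    Ky = np.exp(-np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=2) / (2 * sigma_y ** 2))
    Gx_inv = np.linalg.inv(Kx + n * eps * np.eye(n))
    F = Gx_inv @ Ky @ Gx_inv
    M = np.zeros((d, d))
    for i in range(n):
        # gradient of k_x(., X_j) evaluated at X_i, for all j: a d x n matrix
        Di = (-(X[i][:, None] - X.T) / sigma_x ** 2) * Kx[i][None, :]
        M += Di @ F @ Di.T
    # leading eigenvectors of the candidate matrix span the reduced subspace
    eigval, eigvec = np.linalg.eigh(M / n)
    return eigvec[:, np.argsort(eigval)[::-1][:m]]

# Toy usage: the response depends on a 20-dimensional input only through a few directions.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 20))
y = np.sin(X[:, 0] + X[:, 1]) + 0.5 * X[:, 2] ** 2
B = gkdr_projection(X, y[:, None], m=2, sigma_x=1.0, sigma_y=1.0)
Z = X @ B                                   # reduced inputs
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp.fit(Z, y)                                # emulator trained on the reduced space
```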
