A Laplace Method for Under-Determined Bayesian Optimal Experimental Designs

Abstract In Long et al. (2013), a new method based on the Laplace approximation was developed to accelerate the estimation of the post-experimental expected information gains (Kullback–Leibler divergences) in model parameters and predictive quantities of interest in the Bayesian framework. A closed-form asymptotic approximation of the inner integral and the order of the corresponding dominant error term were obtained for the case where the parameters are fully determined by the experiment. In this work, we extend that method to the general case, in which the model parameters cannot be determined completely from the data of the proposed experiments. We carry out the Laplace approximation in the directions orthogonal to the null space of the Jacobian matrix of the data model with respect to the parameters, so that the information gain reduces to an integral against the marginal density of the transformed parameters that are not determined by the experiments. The expected information gain can then be approximated by an integral over the prior, whose integrand is a function of the posterior covariance matrix projected onto the aforementioned orthogonal directions. To cope with high dimensionality in complex problems, we carry out the integration over the prior probability density function by either Monte Carlo sampling or sparse quadrature, depending on the regularity of the integrand. We demonstrate the accuracy, efficiency and robustness of the proposed method on several nonlinear under-determined test cases. These include the design of the scalar parameter in a one-dimensional cubic polynomial function with two unidentifiable parameters forming a linear manifold, and the design of boundary source locations for impedance tomography in a square domain, where the unknown conductivity is represented as a random field.
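The estimator outlined in the abstract samples from the prior, forms a Gaussian (Laplace) posterior around each sample via the Jacobian of the data model, and averages the resulting Kullback–Leibler gains. A minimal sketch of that pipeline for the simple fully-determined scalar case follows; the forward model, prior width, and noise level below are illustrative assumptions, not taken from the paper.

```python
import math
import random

# Hedged sketch (not the paper's implementation): Laplace-based Monte
# Carlo estimate of the expected information gain (EIG) for a scalar,
# fully-determined parameter with prior N(0, SIGMA_PRIOR^2).
random.seed(0)

SIGMA_PRIOR = 1.0   # prior standard deviation of theta (assumed)
SIGMA_NOISE = 0.1   # observation noise standard deviation (assumed)

def laplace_post_var(theta):
    # Illustrative forward model g(theta) = (theta, theta^2),
    # so the Jacobian is J = (1, 2*theta) and J^T J = 1 + 4*theta^2.
    # Gaussian (Laplace) posterior variance:
    #   (J^T J / SIGMA_NOISE^2 + 1 / SIGMA_PRIOR^2)^(-1)
    jtj = 1.0 + 4.0 * theta * theta
    return 1.0 / (jtj / SIGMA_NOISE**2 + 1.0 / SIGMA_PRIOR**2)

def kl_gain(theta):
    # KL( N(theta, s2) || N(0, SIGMA_PRIOR^2) ): the information gain
    # of the Laplace posterior over the prior, in one dimension.
    s2 = laplace_post_var(theta)
    r = SIGMA_PRIOR**2
    return 0.5 * (s2 / r + theta * theta / r - 1.0 + math.log(r / s2))

def expected_information_gain(n_samples=2000):
    # Outer integration over the prior by plain Monte Carlo sampling.
    total = 0.0
    for _ in range(n_samples):
        theta = random.gauss(0.0, SIGMA_PRIOR)
        total += kl_gain(theta)
    return total / n_samples

print(f"Laplace/Monte Carlo EIG estimate: {expected_information_gain():.3f}")
```

In the under-determined setting the paper addresses, the Jacobian contribution would instead be projected onto the directions orthogonal to its null space before the posterior covariance is formed, so that the gain is measured only along directions the data actually inform.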

[1] J. M. Bernardo, Reference Posterior Distributions for Bayesian Inference, 1979.

[2] L. Tierney, et al., The validity of posterior expansions based on Laplace's method, 1990.

[3] Bertrand Clarke, et al., Partial information reference priors: derivation and interpretations, 2004.

[4] E. Somersalo, et al., Existence and uniqueness for electrode models for electric current computed tomography, 1992.

[5] C. Hwang, Laplace's Method Revisited: Weak Convergence of Probability Measures, 1980.

[6] Quan Long, et al., Fast estimation of expected information gains for Bayesian experimental designs based on Laplace approximations, 2013.

[7] L. Tierney, et al., Fully Exponential Laplace Approximations to Expectations and Variances of Nonpositive Functions, 1989.

[8] Erich Novak, et al., High dimensional polynomial interpolation on sparse grids, Adv. Comput. Math., 2000.

[9] S. Stigler, Laplace's 1774 Memoir on Inverse Probability, 1986.

[10] William W. Hager, et al., Updating the Inverse of a Matrix, SIAM Rev., 1989.

[11] Nicholas G. Polson, Bayesian perspectives on statistical modelling, 1988.

[12] Larry Wasserman, et al., Noninformative priors and nuisance parameters, 1993.

[13] Nicholas G. Polson, On the Expected Amount of Information from a Non-Linear Model, 1992.

[14] Small Noise Asymptotics of the Bayesian Estimator in Nonidentifiable Models, 2002.

[15] M. Spivak, A comprehensive introduction to differential geometry, 1979.

[16] K. Chaloner, et al., Bayesian Experimental Design: A Review, 1995.

[17] Subhashis Ghosal, et al., Expansion of Bayes risk for entropy loss and reference prior in nonregular cases, 1997.

[18] Josep Ginebra, et al., On the measure of the information in a statistical experiment, 2007.

[19] Fabio Nobile, et al., A Sparse Grid Stochastic Collocation Method for Partial Differential Equations with Random Input Data, SIAM J. Numer. Anal., 2008.