Gauss-Hermite Quadrature for non-Gaussian Inference via an Importance Sampling Interpretation

Intractable integrals appear in a plethora of problems in science and engineering. Very often, such integrals also involve a target distribution that is not available in closed form. In both cases, the integrals must be approximated. Monte Carlo (MC) methods are a common approach, approximating the integral with random samples. Quadrature methods are an alternative, approximating the integral with deterministic points and weights; however, suitable points and weights are available only for a limited number of families of distributions. In this paper, we propose a deterministic method, inspired by MC, for approximating generic integrals. The method is derived via an importance sampling (IS) interpretation, an MC methodology in which samples are simulated from a so-called proposal density and weighted appropriately. We start from Gauss-Hermite quadrature rules for Gaussian distributions and transform them to approximate integrals with respect to generic distributions, even when the normalizing constant of the target is unknown. The novel method admits several proposal distributions, which makes it possible to incorporate recent advances from the multiple importance sampling (MIS) literature. We discuss the convergence of the method and illustrate its performance with two numerical examples.
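The abstract describes reinterpreting Gauss-Hermite quadrature through an IS lens: nodes are placed according to a Gaussian proposal, and each node is reweighted by the ratio of the (possibly unnormalized) target to the proposal, with self-normalization absorbing the unknown constant. The following is a minimal sketch of that idea in Python, under our own assumptions about the single-proposal, one-dimensional case; the function name, arguments, and example target are illustrative only and are not taken from the paper.

```python
import numpy as np
from scipy import stats

def gh_is_expectation(f, log_target, mu, sigma, n=20):
    """Approximate E_pi[f(X)] for an unnormalized target pi(x) = exp(log_target(x)),
    using Gauss-Hermite nodes under a Gaussian proposal N(mu, sigma^2) and
    self-normalized, importance-sampling-style reweighting (illustrative sketch)."""
    # Standard Gauss-Hermite rule: integral of exp(-x^2) g(x) dx ~ sum_i w_i g(x_i)
    nodes, gh_weights = np.polynomial.hermite.hermgauss(n)
    # Change of variables so the nodes target the proposal N(mu, sigma^2)
    y = mu + np.sqrt(2.0) * sigma * nodes
    q_weights = gh_weights / np.sqrt(np.pi)   # quadrature weights w.r.t. the proposal density
    log_q = stats.norm.logpdf(y, loc=mu, scale=sigma)
    # IS-style weights: unnormalized target over proposal, evaluated at the nodes
    log_w = log_target(y) - log_q
    w = np.exp(log_w - np.max(log_w))         # subtract the max for numerical stability
    w *= q_weights
    w /= np.sum(w)                            # self-normalization cancels the unknown constant
    return np.sum(w * f(y))

# Hypothetical example: mean of a skewed target known only up to a constant
log_target = lambda y: -0.5 * (y - 1.0) ** 2 + np.log1p(np.exp(2.0 * y))
approx_mean = gh_is_expectation(lambda y: y, log_target, mu=0.0, sigma=2.0, n=30)
print(approx_mean)
```

In this reading, the deterministic nodes play the role of IS "samples" and the Gauss-Hermite weights replace the uniform 1/N weighting of standard self-normalized IS; extending the sketch to several proposals would amount to pooling nodes and weights across proposals, in the spirit of the MIS schemes cited in the abstract.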
