Importance Gaussian Quadrature

Importance sampling (IS) and numerical integration methods are commonly employed to approximate moments of complicated target distributions. In its basic form, the IS methodology draws random samples from a proposal distribution and weights them to account for the mismatch between the target and the proposal. In this work, we present a general framework of numerical integration techniques inspired by the IS methodology. The framework can also be seen as an incorporation of deterministic rules into IS methods, reducing the error of the estimators by several orders of magnitude in a number of problems of interest. The proposed approach extends the range of applicability of Gaussian quadrature rules. For instance, the IS perspective allows us to use Gauss-Hermite rules in problems where the integrand does not involve a Gaussian distribution, and even when the target can only be evaluated up to a normalizing constant, as is usually the case in Bayesian inference. The novel perspective makes use of recent advances in the multiple IS (MIS) and adaptive IS (AIS) literatures and incorporates them into a wider numerical integration framework that combines several quadrature rules which can be iteratively adapted. We analyze the convergence of the algorithms and provide representative examples showing the superior performance of the proposed approach.
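
As a concrete illustration of the IS interpretation of Gauss-Hermite quadrature described above, the following minimal sketch (not taken from the paper; the function names and the toy unnormalized target are hypothetical) treats the transformed Gauss-Hermite nodes as deterministic "samples" from a Gaussian proposal and reweights them, in a self-normalized fashion, against a target known only up to a constant.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.stats import norm

def is_gauss_hermite(log_target, mu, sigma, h, n_nodes=20):
    """Self-normalized estimate of E_pi[h(X)] for an unnormalized target pi,
    using Gauss-Hermite nodes as deterministic 'samples' from the Gaussian
    proposal N(mu, sigma^2) and importance-weighting the mismatch."""
    t, w = hermgauss(n_nodes)              # nodes/weights for the weight exp(-t^2)
    x = mu + np.sqrt(2.0) * sigma * t      # change of variables onto the proposal support
    a = w / np.sqrt(np.pi)                 # normalized quadrature weights (sum to 1)
    # importance weights: quadrature weight times target/proposal density ratio
    log_rho = np.log(a) + log_target(x) - norm.logpdf(x, loc=mu, scale=sigma)
    rho = np.exp(log_rho - log_rho.max())  # stabilize before self-normalizing
    rho /= rho.sum()
    return np.sum(rho * h(x))

# Example: approximate the mean of a toy non-Gaussian, unnormalized target.
log_target = lambda x: -0.5 * (x - 1.0) ** 2 - 0.1 * x ** 4   # known up to a constant
print(is_gauss_hermite(log_target, mu=0.0, sigma=1.5, h=lambda x: x))
```

When the target is exactly the Gaussian proposal, the weights reduce to the standard Gauss-Hermite weights and the rule recovers the classical quadrature estimate; the reweighting step is what lets the same nodes be reused for non-Gaussian or unnormalized targets.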
