Perturbative corrections for approximate inference in Gaussian latent variable models

Expectation Propagation (EP) provides a framework for approximate inference. When the model under consideration is over a latent Gaussian field, with a Gaussian approximating family, we show how the approximation can be systematically corrected. We derive a perturbative expansion of the exact but intractable correction, which can be applied to the model's partition function and other moments of interest. The correction is expressed in terms of the higher-order cumulants that are neglected by EP's local moment matching. The expansion shows that EP is correct to first order. By considering higher orders, corrections of increasing polynomial complexity can be applied to the approximation. The second order provides a correction in quadratic time, which we apply to an array of Gaussian process and Ising models. The corrections generalize to arbitrarily complex approximating families, as we illustrate on tree-structured Ising model approximations. Furthermore, they provide a polynomial-time assessment of the approximation error. We also provide both theoretical and practical insights into the exactness of the EP solution.
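
To make the structure of the correction concrete, the expansion can be sketched as follows. The notation here is schematic rather than the paper's exact symbols: q denotes the Gaussian EP approximation and \varepsilon_i(x_i) a per-site error factor collecting the higher-order cumulants that EP discards. Writing the exact partition function as the EP value times an average over these error factors,

\log Z \;=\; \log Z_{\mathrm{EP}} \;+\; \log \mathbb{E}_{q}\!\Big[\prod_i \big(1 + \varepsilon_i(x_i)\big)\Big] \;\approx\; \log Z_{\mathrm{EP}} \;+\; \sum_i \mathbb{E}_{q}[\varepsilon_i] \;+\; \sum_{i<j} \mathbb{E}_{q}[\varepsilon_i \varepsilon_j] \;+\; \cdots

Because EP's local moment matching makes each first-order term \mathbb{E}_{q}[\varepsilon_i] vanish, EP is correct to first order, and the leading correction is the pairwise second-order sum: with n sites it has O(n^2) terms, which is the quadratic-time correction referred to above.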
