Elliptical Slice Sampling with Expectation Propagation

Markov chain Monte Carlo (MCMC) techniques remain the gold standard for approximate Bayesian inference, but their practical issues, including onerous runtime and sensitivity to tuning parameters, often lead researchers to use faster but typically less accurate deterministic approximations. Here we couple the fast but biased deterministic approximation offered by expectation propagation (EP) with elliptical slice sampling, a state-of-the-art MCMC method. We extend this hybrid deterministic-MCMC method with recycled samples and analytical slices, and we rigorously prove the validity of each enhancement. Taken together, these advances provide an order-of-magnitude gain in efficiency over existing state-of-the-art sampling techniques on Bayesian classification and multivariate Gaussian quadrature problems.
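
To make the core idea concrete, the sketch below shows one elliptical slice sampling step (Murray, Adams & MacKay, 2010) in which the EP Gaussian approximation N(mu, Sigma) plays the role of the ellipse distribution and the residual density (target divided by the EP approximation) plays the role of the likelihood on the slice. This is a minimal illustrative sketch under those assumptions, not the authors' implementation; the names `ep_ess_step`, `log_residual`, `mu`, and `chol_Sigma` are hypothetical.

```python
import numpy as np

def ep_ess_step(f, log_residual, mu, chol_Sigma, rng):
    """One elliptical slice sampling step using an EP Gaussian
    approximation N(mu, Sigma) as the ellipse distribution.

    Illustrative sketch, not the paper's implementation.
    log_residual(f) is assumed to return the log of
    target(f) / N(f; mu, Sigma), up to an additive constant;
    chol_Sigma is a lower Cholesky factor of Sigma.
    """
    d = f.shape[0]
    # Auxiliary point on the ellipse, drawn from N(mu, Sigma).
    nu = mu + chol_Sigma @ rng.standard_normal(d)
    # Log slice threshold under the residual density.
    log_y = log_residual(f) + np.log(rng.uniform())
    # Initial proposal angle and shrinking bracket.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        # Point on the ellipse through f and nu, centred at mu.
        f_prop = (f - mu) * np.cos(theta) + (nu - mu) * np.sin(theta) + mu
        if log_residual(f_prop) > log_y:
            return f_prop  # proposal lies on the slice: accept
        # Shrink the bracket toward theta = 0 and retry; the loop
        # terminates because theta = 0 recovers the current state f.
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

The appeal of this construction is that when the EP approximation is accurate, the residual log-density is nearly flat, so proposals on the ellipse are rarely rejected and the sampler mixes quickly.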
