Closed-Form Gibbs Sampling for Graphical Models with Algebraic Constraints

Probabilistic inference in many real-world problems requires graphical models with deterministic algebraic constraints between random variables (e.g., Newtonian mechanics, Pascal's law, Ohm's law), which are known to be problematic for many inference methods such as Monte Carlo sampling. Fortunately, when such constraints are invertible, the model can be collapsed and the constraints eliminated through the well-known Jacobian-based change of variables. As our first contribution in this work, we show that a much broader class of algebraic constraints can be collapsed by leveraging the properties of a Dirac delta model of deterministic constraints. Unfortunately, the collapsing process can lead to highly piecewise densities that pose challenges for existing probabilistic inference tools. Our second contribution addresses these challenges with a variation of Gibbs sampling that efficiently samples from these piecewise densities. The key insight is to introduce a class of functions that (1) is rich enough to approximate arbitrary models to arbitrary precision, (2) is closed under dimension reduction (collapsing) for models with (non)linear algebraic constraints, and (3) always permits one analytical integral, which suffices to automatically derive closed-form conditionals for Gibbs sampling. Experiments demonstrate that the proposed sampler converges at least an order of magnitude faster than existing Monte Carlo samplers.
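To make the "closed-form conditional" idea concrete, below is a minimal sketch (not the paper's implementation) of the Gibbs step it enables: once a conditional density is piecewise with analytically integrable pieces, its CDF is available in closed form and can be inverted exactly. The sketch uses piecewise-constant pieces for brevity (the paper's function class is richer); the function name and interface are illustrative assumptions.

```python
import bisect
import random

def sample_piecewise_constant(breakpoints, heights, rng=random):
    """Exact inverse-CDF draw from an unnormalized piecewise-constant density.

    breakpoints: sorted list [b0, ..., bK] delimiting K pieces
    heights:     K positive (unnormalized) density values, one per piece
    """
    # Closed-form piece masses; the resulting CDF is piecewise linear.
    masses = [(breakpoints[i + 1] - breakpoints[i]) * heights[i]
              for i in range(len(heights))]
    total = sum(masses)

    # Cumulative masses for locating the piece containing the target quantile.
    cum, acc = [], 0.0
    for m in masses:
        acc += m
        cum.append(acc)

    # Inverse-CDF sampling: pick the piece, then invert its linear CDF segment.
    u = rng.random() * total
    i = bisect.bisect_left(cum, u)
    prev = cum[i - 1] if i > 0 else 0.0
    return breakpoints[i] + (u - prev) / heights[i]

if __name__ == "__main__":
    # Toy conditional: density 2 on [0, 1), density 1 on [1, 3).
    draws = [sample_piecewise_constant([0.0, 1.0, 3.0], [2.0, 1.0])
             for _ in range(10000)]
    print(sum(1 for x in draws if x < 1.0) / len(draws))  # expected ~0.5
```

In a full Gibbs sweep, each coordinate's conditional would be such a piecewise density obtained after collapsing the algebraic constraints, so every update reduces to an exact draw like the one above rather than an accept/reject step.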
