Approximate inference by Markov chains on union spaces

A standard method for approximating averages in probabilistic models is to construct a Markov chain on the product space of the random variables with the desired equilibrium distribution. Since the number of configurations in this space grows exponentially with the number of random variables, the distribution typically has to be represented by samples. In this paper we show that if one is interested in averages over single variables only, an alternative Markov chain defined on the much smaller "union space", which can be evolved exactly, becomes feasible. The transition kernel of this Markov chain is based on conditional distributions over pairs of variables, and we present ways to approximate these conditionals using approximate inference algorithms such as mean field, factorized neighbors, and belief propagation. Robustness to these approximations, and error bounds on the resulting estimates, follow from stability analysis for Markov chains. We also present ideas for a new class of algorithms that iterate between increasingly accurate estimates of conditional and marginal distributions. Experiments validate the proposed methods.
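The key contrast in the abstract is between representing a distribution by samples and evolving it exactly when the state space is small. A minimal sketch of the latter, using an arbitrary illustrative row-stochastic matrix (not a kernel derived from the paper's pairwise conditionals):

```python
import numpy as np

# Exact evolution of a Markov chain on a small state space: instead of
# drawing Monte Carlo samples, we propagate the full distribution through
# the transition kernel until it reaches a fixed point. The kernel T is a
# hypothetical 3-state example chosen only for illustration.
T = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
])

p = np.array([1.0, 0.0, 0.0])  # initial distribution: all mass on state 0
for _ in range(1000):
    p_next = p @ T  # one exact transition step, no sampling involved
    if np.max(np.abs(p_next - p)) < 1e-12:
        break
    p = p_next

# At the fixed point, p approximates the equilibrium distribution pi
# satisfying pi = pi T.
print(np.round(p, 4))
```

With a state space of single-variable configurations rather than joint configurations, this kind of exact propagation stays tractable even when the product space would be exponentially large.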

[1]  Yee Whye Teh,et al.  The Unified Propagation and Scaling Algorithm , 2001, NIPS.

[2]  P. Green,et al.  Trans-dimensional Markov chain Monte Carlo , 2000 .

[3]  W. Freeman,et al.  Generalized Belief Propagation , 2000, NIPS.

[4]  Hilbert J. Kappen,et al.  Bound Propagation , 2003, J. Artif. Intell. Res..

[5]  Ilse C. F. Ipsen,et al.  Uniform Stability of Markov Chains , 1994, SIAM J. Matrix Anal. Appl..

[6]  John Odentrantz,et al.  Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues , 2000, Technometrics.