Decayed MCMC Filtering

Filtering--estimating the state of a partially observable Markov process from a sequence of observations--is one of the most widely studied problems in control theory, AI, and computational statistics. Exact computation of the posterior distribution is generally intractable for large discrete systems and for nonlinear continuous systems, so a good deal of effort has gone into developing robust approximation algorithms. This paper describes a simple stochastic approximation algorithm for filtering called decayed MCMC. The algorithm applies Markov chain Monte Carlo sampling to the space of state trajectories using a proposal distribution that favours flips of more recent state variables. The formal analysis of the algorithm involves a generalization of standard coupling arguments for MCMC convergence. We prove that for any ergodic underlying Markov process, the convergence time of decayed MCMC with inverse-polynomial decay remains bounded as the length of the observation sequence grows. We show experimentally that decayed MCMC is at least competitive with other approximation algorithms such as particle filtering.
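The idea in the abstract can be sketched concretely. The following is a minimal illustration, not the paper's implementation: a binary-state HMM in which an MCMC chain over the whole trajectory resamples one time step per iteration, choosing that time step with probability proportional to an inverse-polynomial decay in its age, so recent state variables are flipped far more often than old ones. The model parameters (`p_stay`, `p_obs`) and the decay exponent `alpha` are illustrative assumptions.

```python
import random

def decayed_mcmc_filter(obs, p_stay=0.9, p_obs=0.8, alpha=2.0,
                        n_iter=20000, burn=5000, seed=0):
    """Estimate P(x_T = 1 | y_{1:T}) by MCMC over state trajectories,
    concentrating resampling effort on recent time steps."""
    rng = random.Random(seed)
    T = len(obs)
    x = [rng.randint(0, 1) for _ in range(T)]  # initial trajectory

    # Inverse-polynomial decay: time step t is proposed with probability
    # proportional to 1 / (T - t)^alpha, so the most recent variable
    # (t = T-1) is resampled most often.
    weights = [(T - t) ** -alpha for t in range(T)]

    def trans(a, b):  # P(x_{t+1} = b | x_t = a)
        return p_stay if a == b else 1.0 - p_stay

    def emit(a, y):   # P(y_t = y | x_t = a)
        return p_obs if a == y else 1.0 - p_obs

    hits = 0
    for it in range(n_iter):
        t = rng.choices(range(T), weights)[0]
        # Gibbs-resample x_t given its Markov blanket and the observation;
        # this leaves the posterior over trajectories invariant.
        w = [1.0, 1.0]
        for v in (0, 1):
            if t > 0:
                w[v] *= trans(x[t - 1], v)
            if t < T - 1:
                w[v] *= trans(v, x[t + 1])
            w[v] *= emit(v, obs[t])
        x[t] = 1 if rng.random() < w[1] / (w[0] + w[1]) else 0
        if it >= burn:
            hits += x[-1]          # filtering query: the final state
    return hits / (n_iter - burn)

# With observations that are almost all 1, the filtered estimate of
# P(x_T = 1) should be well above one half.
est = decayed_mcmc_filter([1, 1, 0, 1, 1, 1, 1, 1, 1, 1])
```

Because only the flip schedule differs from ordinary Gibbs sampling over the trajectory, the chain still targets the exact posterior; the decay merely directs computation toward the variables that matter for the filtering query, which is what the paper's bounded-convergence-time result concerns.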
