Improved Particle Approximations to the Joint Smoothing Distribution Using Markov Chain Monte Carlo

Particle filtering and smoothing algorithms approximate posterior state distributions with a set of samples drawn from those distributions. Conventionally, samples from the joint smoothing distribution are generated by sequentially resampling from the particle filter results. When the number of filtering particles is large, this procedure becomes computationally expensive. In addition, the support of the smoothing approximation is restricted to the values that appear in the filtering approximation. In this paper, a Metropolis-Hastings sampling procedure is used to improve the efficiency of the particle smoother, achieving comparable error performance with a lower execution time. In addition, an algorithm for approximating the joint smoothing distribution without this support restriction is presented, which achieves simultaneous improvements in both execution time and error. These algorithms also provide greater flexibility than existing methods, allowing a trade-off between execution time and error that is controlled by the length of the Markov chains. A sketch of the general idea is given below.
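To illustrate the kind of procedure described above, the following is a minimal sketch of backward simulation from stored particle filter output in which the exact backward draw is replaced by a short Metropolis-Hastings chain over filter indices. It assumes a simple linear-Gaussian model; the parameters (a, sig_trans, sig_obs), the bootstrap filter, and the chain length R are illustrative choices, not the specific algorithm or settings of the paper.

```python
# Hedged sketch: MH backward simulation for a particle smoother.
# Assumed model (not from the paper): x_t = a*x_{t-1} + N(0, sig_trans^2),
# y_t = x_t + N(0, sig_obs^2).
import numpy as np

rng = np.random.default_rng(0)
a, sig_trans, sig_obs = 0.9, 1.0, 0.5     # assumed model parameters
T, N, R = 50, 500, 10                     # time steps, particles, MH chain length

def log_trans(x_next, x_prev):
    # Log transition density log f(x_next | x_prev), up to a constant.
    return -0.5 * ((x_next - a * x_prev) / sig_trans) ** 2

# Simulate data from the assumed model.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + sig_trans * rng.standard_normal()
y = x_true + sig_obs * rng.standard_normal(T)

# Standard bootstrap particle filter, storing particles and weights.
particles = np.zeros((T, N))
weights = np.zeros((T, N))
particles[0] = rng.standard_normal(N)
for t in range(T):
    if t > 0:
        idx = rng.choice(N, size=N, p=weights[t - 1])        # resample
        particles[t] = a * particles[t - 1, idx] + sig_trans * rng.standard_normal(N)
    logw = -0.5 * ((y[t] - particles[t]) / sig_obs) ** 2     # likelihood weights
    weights[t] = np.exp(logw - logw.max())
    weights[t] /= weights[t].sum()

# MH backward simulation of one smoothed trajectory: instead of the O(N)
# exact backward draw at each step, run a short MH chain over filter indices.
# The chain length R trades execution time against error.
traj = np.zeros(T)
j = rng.choice(N, p=weights[T - 1])
traj[T - 1] = particles[T - 1, j]
for t in range(T - 2, -1, -1):
    j = rng.choice(N, p=weights[t])                          # initial index
    for _ in range(R):
        k = rng.choice(N, p=weights[t])                      # propose from filter weights
        # Proposing from the weights cancels them in the acceptance ratio,
        # leaving only the ratio of transition densities.
        log_alpha = (log_trans(traj[t + 1], particles[t, k])
                     - log_trans(traj[t + 1], particles[t, j]))
        if np.log(rng.uniform()) < log_alpha:
            j = k
    traj[t] = particles[t, j]
```

Each MH step costs O(1) rather than the O(N) of evaluating all backward weights, so the per-trajectory cost scales with R instead of N; increasing R reduces the error of the approximation at the expense of execution time.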
