On general sampling schemes for Particle Markov chain Monte Carlo methods

Particle Markov chain Monte Carlo methods [Andrieu et al., 2010] are used to carry out inference in non-linear and non-Gaussian state space models, where the posterior density of the states is approximated using particles. Current approaches usually carry out Bayesian inference using either a particle Metropolis-Hastings algorithm or a particle Gibbs sampler. In this paper, we give a general approach to constructing sampling schemes that converge to the target distributions given in Andrieu et al. [2010] and Olsson and Rydén [2011]. We describe our methods as a particle Metropolis within Gibbs (PMwG) sampler. The advantage of this general approach is that the sampling scheme can be tailored to obtain good results in different applications. We investigate the properties of the general sampling scheme, including conditions for uniform convergence to the posterior. We illustrate our methods with examples of state space models in which one group of parameters can be generated straightforwardly in a particle Gibbs step by conditioning on the states, but where generating such parameters is cumbersome and inefficient when the states are integrated out. Conversely, it may be necessary to generate a second group of parameters without conditioning on the states because of the high dependence between those parameters and the states. Our examples include state space models with diffuse initial conditions, for which we introduce two methods of handling the initial conditions.

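As background to the sampling schemes discussed in the abstract, the following is a minimal sketch, not the paper's implementation, of the particle marginal Metropolis-Hastings building block: a bootstrap particle filter produces an unbiased likelihood estimate, which drives a random-walk Metropolis-Hastings update for a parameter with the states integrated out. The linear Gaussian AR(1) model, the flat prior on the autoregressive coefficient, and all tuning constants are illustrative assumptions.

```python
# Minimal particle marginal Metropolis-Hastings (PMMH) sketch for an assumed
# linear Gaussian AR(1) state space model (not the paper's examples):
#   x_t = phi * x_{t-1} + v_t,   v_t ~ N(0, sigma_v^2)
#   y_t = x_t + w_t,             w_t ~ N(0, sigma_w^2)
import numpy as np

rng = np.random.default_rng(0)


def simulate(T, phi, sigma_v=1.0, sigma_w=1.0):
    """Simulate data from the assumed AR(1) state space model."""
    x = np.zeros(T)
    x[0] = rng.normal(0.0, sigma_v / np.sqrt(1.0 - phi ** 2))
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma_v)
    return x + rng.normal(0.0, sigma_w, size=T)


def bootstrap_log_likelihood(y, phi, N=200, sigma_v=1.0, sigma_w=1.0):
    """Bootstrap particle filter estimate of log p(y_{1:T} | phi)."""
    T = len(y)
    particles = rng.normal(0.0, sigma_v / np.sqrt(1.0 - phi ** 2), size=N)
    weights = np.full(N, 1.0 / N)
    log_lik = 0.0
    for t in range(T):
        if t > 0:
            # Multinomial resampling, then propagation through the transition density.
            idx = rng.choice(N, size=N, p=weights)
            particles = phi * particles[idx] + rng.normal(0.0, sigma_v, size=N)
        # Unnormalised importance weights from the Gaussian measurement density.
        log_w = -0.5 * np.log(2.0 * np.pi * sigma_w ** 2) \
                - 0.5 * (y[t] - particles) ** 2 / sigma_w ** 2
        max_lw = log_w.max()
        w = np.exp(log_w - max_lw)
        log_lik += max_lw + np.log(w.mean())  # likelihood increment estimate
        weights = w / w.sum()
    return log_lik


def pmmh(y, n_iter=2000, step=0.05):
    """Random-walk PMMH for phi with a flat prior on (-1, 1)."""
    phi = 0.5
    log_lik = bootstrap_log_likelihood(y, phi)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        phi_prop = phi + step * rng.normal()
        if abs(phi_prop) < 1.0:
            log_lik_prop = bootstrap_log_likelihood(y, phi_prop)
            # Accept/reject using the estimated likelihood ratio (pseudo-marginal step).
            if np.log(rng.uniform()) < log_lik_prop - log_lik:
                phi, log_lik = phi_prop, log_lik_prop
        draws[i] = phi
    return draws


if __name__ == "__main__":
    y = simulate(T=100, phi=0.8)
    draws = pmmh(y)
    print("posterior mean of phi (after burn-in):", draws[500:].mean())
```

The PMwG schemes discussed in the paper combine steps of this pseudo-marginal form with particle Gibbs steps that condition on a sampled state trajectory; the conditional sequential Monte Carlo update needed for the latter is not shown in this sketch.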
[1] F. Lindsten, et al. On the use of backward simulation in particle Markov chain Monte Carlo methods, 2011. arXiv:1110.2873.

[2] Eric Moulines, et al. Comparison of resampling schemes for particle filtering, 2005. Proceedings of the 4th International Symposium on Image and Signal Processing and Analysis (ISPA 2005).

[3] J. Olsson and T. Rydén. Rao-Blackwellization of Particle Markov Chain Monte Carlo Methods Using Forward Filtering Backward Sampling, 2011. IEEE Transactions on Signal Processing.

[4] J. Richard, et al. Efficient high-dimensional importance sampling, 2007.

[5] Rong Chen, et al. New sequential Monte Carlo methods for nonlinear dynamic systems, 2005. Stat. Comput.

[6] R. Kohn, et al. Markov chain Monte Carlo in conditionally Gaussian state space models, 1996.

[7] N. Gordon, et al. Novel approach to nonlinear/non-Gaussian Bayesian state estimation, 1993.

[8] M. Netto, et al. A New Spline Algorithm for Non-Linear Filtering of Discrete Time Systems, 1978.

[9] C. Andrieu, A. Doucet, and R. Holenstein. Particle Markov chain Monte Carlo methods, 2010.

[10] L. Tierney. A note on Metropolis-Hastings kernels for general state spaces, 1998.

[11] G. Kitagawa. Monte Carlo Filter and Smoother for Non-Gaussian Nonlinear State Space Models, 1996.

[12] Nando de Freitas, et al. The Unscented Particle Filter, 2000. NIPS.

[13] Michael A. West, et al. Combined Parameter and State Estimation in Simulation-Based Filtering, 2001. Sequential Monte Carlo Methods in Practice.

[14] C. Andrieu, et al. The pseudo-marginal approach for efficient Monte Carlo computations, 2009. arXiv:0903.5480.

[15] S. Frühwirth-Schnatter. Data Augmentation and Dynamic Linear Models, 1994.

[16] Ralph S. Silva, et al. On Some Properties of Markov Chain Monte Carlo Simulation Methods Based on the Particle Filter, 2012.

[17] Nicholas G. Polson, et al. A Monte Carlo Approach to Nonnormal and Nonlinear State-Space Modeling, 1992.

[18] Sylvia Frühwirth-Schnatter. Finite Mixture and Markov Switching Models, 2006.

[19] Geir Storvik. Particle filters for state-space models with the presence of unknown static parameters, 2002. IEEE Trans. Signal Process.

[20] M. Pitt, et al. Filtering via Simulation: Auxiliary Particle Filters, 1999.

[21] G. Wahba. Bayesian "Confidence Intervals" for the Cross-validated Smoothing Spline, 1983.

[22] R. Kohn, et al. On Gibbs sampling for state space models, 1994.

[23] Simon J. Godsill, et al. On sequential Monte Carlo sampling methods for Bayesian filtering, 2000. Stat. Comput.

[24] C. Ansley, et al. The Signal Extraction Approach to Nonlinear Regression and Spline Smoothing, 1983.

[25] C. Andrieu, et al. Convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms, 2012. arXiv:1210.1484.

[26] Sumeetpal S. Singh, et al. On the Particle Gibbs Sampler, 2013.

[27] Gareth O. Roberts, et al. Examples of Adaptive MCMC, 2009.

[28] Galin L. Jones, et al. Fixed-Width Output Analysis for Markov Chain Monte Carlo, 2006. arXiv:math/0601446.

[29] N. Shephard, et al. Stochastic Volatility: Likelihood Inference and Comparison with ARCH Models, 1996.

[30] R. Kohn, et al. Efficient Bayesian Inference for Dynamic Mixture Models, 2000.

[31] J. Rosenthal, et al. General state space Markov chains and MCMC algorithms, 2004. arXiv:math/0404033.

[32] J. Rosenthal, et al. Geometric Ergodicity and Hybrid Markov Chains, 1997.

[33] Nicholas G. Polson, et al. Practical filtering with sequential parameter learning, 2008.

[34] M. Pitt, et al. Likelihood analysis of non-Gaussian measurement time series, 1997.

[35] Robert Kohn, et al. A structured state space approach to computing the likelihood of an ARIMA process and its derivatives, 1985.

[36] A. Doucet, et al. Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator, 2012. arXiv:1210.1871.

[37] A. Doucet, et al. Monte Carlo Smoothing for Nonlinear Time Series, 2004. Journal of the American Statistical Association.

[38] P. Del Moral, et al. Sequential Monte Carlo for Bayesian Computation, 2006.

[39] Robert Kohn, et al. Auxiliary particle filtering within adaptive Metropolis-Hastings sampling, 2010. arXiv:1006.1914.