A particle algorithm for sequential Bayesian parameter estimation and model selection

We describe a particle algorithm for the sequential Bayesian estimation of unknown static parameters. The algorithm combines sequential importance sampling (SIS) and Markov chain Monte Carlo (MCMC) to achieve computational efficiency and stability. In its most general form, the algorithm has three components: i) SIS; ii) a rejuvenation test; and iii) MCMC. Measurements are processed sequentially by SIS (with an artificial "time-line" imposed if the measurements have no natural one), which is computationally inexpensive. After each measurement is processed, the rejuvenation test checks whether the resulting SIS particles need to be rejuvenated. When indicated by the test, the particles are completely rejuvenated by MCMC, which removes the errors that accumulate in SIS because of the finite number of particles, thus ensuring stability. Whenever possible, the SIS particles can be used to advantage in the MCMC. There is flexibility in the choice of both the rejuvenation test and the MCMC method, which broadens the algorithm's usefulness. In particular, by using reversible-jump MCMC with multiple models, the algorithm can perform simultaneous model selection and parameter estimation. In this paper, we use a rejuvenation test based on a Kullback-Leibler distance that is easy to compute, and our choice of MCMC is independent Metropolis-Hastings with a Gaussian proposal density. With these choices, we illustrate the use of the algorithm in two signal processing applications (passive source localization from angle-of-arrival measurements, and simultaneous weak-signal detection and parameter estimation) involving both simulated and real data. The results demonstrate the algorithm's stability, its built-in protection against model overfitting, and its tolerance to model mismatch.
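
The Python sketch below illustrates the overall structure described above: SIS reweighting of static-parameter particles, a Kullback-Leibler-style rejuvenation test computed from the weights, and complete rejuvenation by independent Metropolis-Hastings with a Gaussian proposal fitted to the current particle cloud. The function names (`run_particle_algorithm`, `log_prior`, `log_likelihood`, `sample_prior`), the threshold value, and the exact form of the test and the MH move are illustrative assumptions rather than the authors' implementation; `log_prior` and `log_likelihood` are assumed to be vectorized over an (N, d) array of particles.

```python
# A minimal sketch (assumptions as noted above) of the SIS + rejuvenation-test +
# MCMC scheme for a static parameter theta. Not the authors' exact implementation.
import numpy as np
from scipy.stats import multivariate_normal


def run_particle_algorithm(y, log_prior, log_likelihood, sample_prior,
                           n_particles=1000, kl_threshold=0.5, n_mh_steps=5,
                           rng=None):
    rng = np.random.default_rng() if rng is None else rng
    theta = sample_prior(n_particles, rng)      # (N, d) particles drawn from the prior
    log_w = np.zeros(n_particles)               # unnormalized log-weights

    for t, y_t in enumerate(y):
        # i) SIS: reweight the static-parameter particles by the new measurement.
        log_w += log_likelihood(theta, y_t)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()

        # ii) Rejuvenation test: a KL-style distance between the weighted and
        # uniform particle distributions, standing in for the paper's test.
        kl = np.sum(w * np.log(np.maximum(w * n_particles, 1e-300)))
        if kl > kl_threshold:
            # iii) MCMC rejuvenation: independent Metropolis-Hastings with a
            # Gaussian proposal fitted to the current (SIS) particle cloud.
            mean = np.average(theta, weights=w, axis=0)
            cov = np.atleast_2d(np.cov(theta.T, aweights=w))
            cov += 1e-9 * np.eye(theta.shape[1])

            def log_target(th):                 # posterior of theta given y[0..t]
                return log_prior(th) + sum(log_likelihood(th, y_s)
                                           for y_s in y[:t + 1])

            def log_proposal(th):
                return multivariate_normal.logpdf(th, mean=mean, cov=cov)

            # Resample to equal weights, then move every particle by MH.
            theta = theta[rng.choice(n_particles, size=n_particles, p=w)]
            cur_t, cur_q = log_target(theta), log_proposal(theta)
            for _ in range(n_mh_steps):
                prop = rng.multivariate_normal(mean, cov, size=n_particles)
                prop_t, prop_q = log_target(prop), log_proposal(prop)
                log_alpha = (prop_t - cur_t) - (prop_q - cur_q)
                accept = np.log(rng.random(n_particles)) < log_alpha
                theta[accept] = prop[accept]
                cur_t[accept], cur_q[accept] = prop_t[accept], prop_q[accept]
            log_w = np.zeros(n_particles)       # rejuvenated particles are unweighted

    w = np.exp(log_w - log_w.max())
    return theta, w / w.sum()
```

A full implementation in the spirit of the paper would additionally handle multiple candidate models via reversible-jump MCMC for simultaneous model selection; that extension is omitted from this sketch.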
