Progressive Bayes: a new framework for nonlinear state estimation

This paper is concerned with recursively estimating the internal state of a nonlinear dynamic system by processing noisy measurements and the known system input. For continuous states, an exact analytic representation of the probability density characterizing the estimate is generally too complex for recursive estimation, or even impossible to obtain. It is therefore replaced by a convenient type of approximate density characterized by a finite set of parameters. Ideally, these parameters should systematically minimize a given measure of deviation between the (often unknown) exact density and its approximation, which in general leads to a complicated optimization problem. Here, a new framework for state estimation based on progressive processing is proposed. Rather than attacking the original optimization problem directly, it is exactly converted into a corresponding system of explicit first-order ordinary differential equations. Solving this system over a finite "time" interval yields the desired optimal density parameters.
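The progressive idea can be illustrated on a toy case. Suppose the likelihood is introduced gradually as a power l(z|x)^γ with γ running from 0 to 1, and the approximate density's parameters are tracked by an ODE in γ. The sketch below (an assumption for illustration, not the paper's general algorithm) uses a scalar linear-Gaussian model, where the parameter flow happens to be exact and its endpoint can be checked against the standard Bayes/Kalman update:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy model (hypothetical numbers): prior x ~ N(m0, P0),
# measurement z = x + v with v ~ N(0, R).
m0, P0 = 0.0, 4.0
z, R = 2.0, 1.0

# Progressive processing: incorporate the likelihood as l(z|x)^gamma,
# gamma: 0 -> 1. For this linear-Gaussian case the posterior stays
# Gaussian and its parameters obey the explicit first-order ODEs
#   dm/dgamma = P (z - m) / R,   dP/dgamma = -P^2 / R.
def flow(gamma, y):
    m, P = y
    return [P * (z - m) / R, -P**2 / R]

sol = solve_ivp(flow, (0.0, 1.0), [m0, P0], rtol=1e-10, atol=1e-12)
m1, P1 = sol.y[:, -1]

# Reference: the one-shot Bayes/Kalman update for the same model.
K = P0 / (P0 + R)
m_ref, P_ref = m0 + K * (z - m0), (1 - K) * P0

print(m1, P1)  # should agree with (m_ref, P_ref) up to solver tolerance
```

In the nonlinear, non-Gaussian setting targeted by the paper, the right-hand side of the flow is instead derived from the chosen deviation measure between the exact and approximate densities; the structure (integrate an explicit ODE system over a finite "time" interval) is the same.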
