Progressive Bayesian estimation for nonlinear discrete-time systems: the measurement step

This paper is concerned with estimating the internal state of a dynamic system by processing measurements taken from the system output. An exact analytic representation of the probability density functions characterizing the estimate may not be obtainable, and even when it is available, it may be too complex to be of practical use, for example when it has to be applied recursively. Hence, approximations are generally inevitable. Gaussian mixture approximations are convenient for a number of reasons. However, calculating appropriate mixture parameters that minimize a global measure of deviation from the true density is a difficult optimization problem. Here, we propose an approximation method that minimizes the squared integral deviation between the true density and its mixture approximation. Rather than solving the original optimization problem directly, it is converted into a corresponding system of explicit first-order ordinary differential equations. This system of differential equations is then solved over a finite "time" interval, which provides an efficient way of calculating the desired optimal parameter values. For polynomial measurement nonlinearities, closed-form analytic expressions for the coefficients of the system of differential equations are derived.
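
To illustrate the idea, the following Python sketch mimics a progressive measurement step in the simplest possible setting: a scalar state, a single Gaussian approximation, and a cubic measurement function. Differentiating the stationarity condition of the squared integral deviation with respect to the progression parameter gamma yields a system of the form P(eta, gamma) * d(eta)/d(gamma) = b(eta, gamma); here the coefficients are obtained by grid quadrature and finite differences rather than the closed-form expressions derived in the paper, and the model parameters (m0, s0, r, y_hat) are purely illustrative assumptions.

```python
# Toy numerical sketch of the progressive measurement step, NOT the paper's
# closed-form construction.  A single Gaussian N(x; m, s^2) tracks the
# progressively incorporated posterior by integrating an ODE for eta = (m, s)
# over the artificial "time" gamma in [0, 1].  The cubic measurement function,
# grid quadrature, and finite-difference coefficients are illustrative choices.
import numpy as np
from scipy.integrate import solve_ivp

m0, s0 = 0.5, 1.0                  # prior mean and standard deviation (assumed)
r, y_hat = 0.5, 2.0                # measurement noise std and observed value (assumed)
x = np.linspace(-6.0, 6.0, 2001)   # quadrature grid
dx = x[1] - x[0]
integ = lambda g: np.sum(g) * dx   # simple rectangle-rule quadrature

def gaussian(u, m, s):
    return np.exp(-0.5 * ((u - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def true_density(gamma):
    """Grid-normalized progressive posterior ~ prior(x) * likelihood(x)**gamma."""
    f = gaussian(x, m0, s0) * gaussian(y_hat, x ** 3, r) ** gamma
    return f / integ(f)

def grad_G(eta, gamma):
    """Gradient w.r.t. eta of G = 0.5 * int (f_true(x, gamma) - f(x, eta))^2 dx."""
    m, s = eta
    f = gaussian(x, m, s)
    resid = true_density(gamma) - f
    df_dm = f * (x - m) / s ** 2
    df_ds = f * ((x - m) ** 2 / s ** 3 - 1.0 / s)
    return np.array([-integ(resid * df_dm), -integ(resid * df_ds)])

def eta_dot(gamma, eta, h=1e-5):
    """Differentiating the stationarity condition grad_G = 0 w.r.t. gamma gives
    P(eta, gamma) * d(eta)/d(gamma) = b(eta, gamma); P and b are approximated
    here by finite differences instead of closed-form coefficients."""
    g0 = grad_G(eta, gamma)
    P = np.column_stack([(grad_G(eta + h * e, gamma) - g0) / h for e in np.eye(2)])
    b = -(grad_G(eta, gamma + h) - g0) / h
    return np.linalg.solve(P, b)

# At gamma = 0 the prior itself is the optimal approximation, so start there
# and integrate up to gamma = 1, where the full measurement is incorporated.
sol = solve_ivp(eta_dot, (0.0, 1.0), [m0, s0], rtol=1e-6, atol=1e-8)
m1, s1 = sol.y[:, -1]
print(f"approximate posterior: mean = {m1:.3f}, std = {s1:.3f}")
```

Starting from the prior at gamma = 0, where it is exactly optimal, and integrating the parameter flow up to gamma = 1 incorporates the measurement gradually, which is what allows the nonconvex fitting problem to be replaced by the solution of an initial value problem.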
