This paper is concerned with recursively estimating the internal state sequence of a discrete-time dynamic system by processing a sequence of noisy measurements taken from the system output. Recursive processing requires some kind of sufficient statistic for representing the information collected up to a certain time step. For this purpose, the probability density functions of the state are especially well suited: once they are available, almost any type of point estimate, e.g., mean, mode, or median, can be derived. In the case of continuous states, however, the exact probability density functions characterizing the state estimate are in general either infeasible to compute or not well suited for recursive processing. Hence, approximations of the true densities are generally inevitable, and Gaussian mixture approximations are convenient for a number of reasons. Calculating appropriate mixture parameters that minimize a global measure of deviation from the true density is, however, a challenging optimization task. Here, we propose a new approximation method that minimizes the squared integral deviation between the true density and its mixture approximation. Rather than solving the original optimization problem directly, it is converted into a corresponding system of explicit first-order ordinary differential equations. This system of differential equations is then solved over a finite "time" interval, which is an efficient way of calculating the desired optimal parameter values. We focus on the measurement update in the important case of vector states and scalar measurements. In addition, approximation densities with separable kernels are assumed. It will be shown that if the measurement nonlinearities are also separable, the required multidimensional integrals reduce to products of one-dimensional integrals. For several important types of measurement functions, including polynomial measurement nonlinearities, closed-form analytic expressions for the coefficients of the system of differential equations are available.
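To make the progression idea concrete, the following is a minimal one-dimensional sketch, not the authors' implementation. It approximates an unnormalized posterior density, here arising from an illustrative polynomial measurement nonlinearity h(x) = x^2 with Gaussian prior and noise, by a two-component Gaussian mixture. A progression parameter gamma in [0, 1] blends from a density that the initial mixture matches exactly to the true density, so that differentiating the L2 optimality condition with respect to gamma yields a parameter flow P(eta) * d(eta)/d(gamma) = b(eta), which is integrated over the finite "time" interval [0, 1]. The second-derivative term of P is neglected (a Gauss-Newton-style approximation), the integrals are evaluated on a grid rather than in closed form, and all names and numerical settings are assumptions chosen for illustration.

```python
# Minimal 1-D sketch of the progressive (homotopy) parameter flow described above.
# All model choices (h(x) = x^2, grid, initial mixture) are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

x = np.linspace(-4.0, 4.0, 2001)          # integration grid
dx = x[1] - x[0]

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (np.sqrt(2.0 * np.pi) * s)

def mixture(eta, x):
    """Gaussian mixture f(x; eta) with eta = [w_1..w_K, m_1..m_K, s_1..s_K]."""
    K = len(eta) // 3
    w, m, s = eta[:K], eta[K:2 * K], eta[2 * K:]
    return sum(w[i] * gauss(x, m[i], s[i]) for i in range(K))

def jacobian(eta, x):
    """Rows are the analytic derivatives df/d(eta_j) evaluated on the grid."""
    K = len(eta) // 3
    w, m, s = eta[:K], eta[K:2 * K], eta[2 * K:]
    J = np.empty((3 * K, x.size))
    for i in range(K):
        g = gauss(x, m[i], s[i])
        J[i] = g                                                     # d/dw_i
        J[K + i] = w[i] * g * (x - m[i]) / s[i] ** 2                 # d/dm_i
        J[2 * K + i] = w[i] * g * ((x - m[i]) ** 2 / s[i] ** 3 - 1.0 / s[i])  # d/ds_i
    return J

# Example "true" density: unnormalized posterior for a scalar measurement
# y = h(x) + v with polynomial nonlinearity h(x) = x^2 (illustrative assumption).
prior = lambda x: gauss(x, 0.0, 1.0)
y_meas, r = 1.0, 0.3
target = lambda x: gauss(y_meas - x ** 2, 0.0, r) * prior(x)

# Initial mixture parameters; the homotopy density at gamma = 0 is this mixture,
# so the optimality condition holds trivially at the start of the progression.
eta0 = np.array([0.3, 0.3, -1.0, 1.0, 0.8, 0.8])   # [weights, means, stds]
f_init = mixture(eta0, x)

def eta_dot(gamma, eta):
    """Parameter flow: solve P(eta) d(eta)/d(gamma) = b(eta) in a least-squares sense."""
    J = jacobian(eta, x)
    P = (J @ J.T) * dx                      # Gauss-Newton approximation of d^2G/deta^2
    b = (J @ (target(x) - f_init)) * dx     # from d/dgamma of the blended density
    return np.linalg.lstsq(P, b, rcond=None)[0]

sol = solve_ivp(eta_dot, (0.0, 1.0), eta0, max_step=0.05)
print("optimized mixture parameters:", sol.y[:, -1])
```

The least-squares solve guards against a nearly singular P; in the setting of the paper, the grid-based integrals used here would be replaced by the closed-form coefficient expressions available for separable kernels and separable (e.g., polynomial) measurement nonlinearities.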