Adaptive uncertainty estimation for particle filter-based trackers

In particle filter-based visual trackers, dynamic velocity components are typically incorporated into the state update equations. The uncertainty introduced at the model update stage can then be amplified in unexpected and undesirable ways, causing the tracker to behave erroneously. To address this problem, we propose a continuously adaptive approach to estimating uncertainty in the particle filter, one that balances the uncertainty in its static and dynamic elements. We provide a quantitative performance evaluation of the resulting particle filter tracker on a set of ten video sequences, reporting results in terms of a metric for the objective evaluation of visual trackers. Using this metric, we compare our modified particle filter tracker against the continuously adaptive mean shift tracker. The results show that adaptive parameter estimation significantly improves the performance of the particle filter, particularly under occlusion and nonlinear target motion.
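The idea of balancing static and dynamic uncertainty can be illustrated with a minimal constant-velocity particle filter in which the process noise on the velocity (dynamic) component is adapted from the measurement innovation. This is a sketch under stated assumptions, not the paper's actual algorithm: the adaptation rule (an exponential moving average of the innovation magnitude) and all parameter names (`sigma_pos`, `sigma_vel0`, `sigma_meas`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_particle_filter(measurements, n_particles=500,
                             sigma_meas=1.0, sigma_pos=0.5, sigma_vel0=0.5):
    """Track a 2-D point with a constant-velocity particle filter whose
    dynamic (velocity) process noise is adapted from the innovation.
    Illustrative sketch only; the adaptation rule is hypothetical."""
    # Per-particle state: [x, y, vx, vy].
    particles = np.zeros((n_particles, 4))
    particles[:, :2] = measurements[0] + rng.normal(0, sigma_meas, (n_particles, 2))
    sigma_vel = sigma_vel0          # adaptive dynamic-noise scale
    estimates = [particles[:, :2].mean(axis=0)]
    for z in measurements[1:]:
        # Predict: constant-velocity motion, plus static noise on position
        # and (adaptively scaled) dynamic noise on velocity.
        particles[:, :2] += particles[:, 2:]
        particles[:, :2] += rng.normal(0, sigma_pos, (n_particles, 2))
        particles[:, 2:] += rng.normal(0, sigma_vel, (n_particles, 2))
        # Weight by a Gaussian measurement likelihood.
        d2 = ((particles[:, :2] - z) ** 2).sum(axis=1)
        w = np.exp(-0.5 * d2 / sigma_meas ** 2)
        w /= w.sum()
        est = (particles[:, :2] * w[:, None]).sum(axis=0)
        # Adapt: a large innovation means the dynamic model is unreliable,
        # so inflate the velocity noise to let particles re-acquire the target.
        innovation = np.linalg.norm(est - z)
        sigma_vel = 0.9 * sigma_vel + 0.1 * (sigma_vel0 + innovation)
        # Systematic resampling.
        u = (rng.random() + np.arange(n_particles)) / n_particles
        idx = np.minimum(np.searchsorted(np.cumsum(w), u), n_particles - 1)
        particles = particles[idx]
        estimates.append(est)
    return np.array(estimates)
```

With a fixed `sigma_vel`, the same filter either lags during abrupt motion (noise too small) or jitters during smooth motion (noise too large); letting the innovation drive `sigma_vel` trades these off continuously, which is the spirit of the adaptive estimation the abstract describes.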
