Estimating uncertainty in SSD-based feature tracking

Abstract

Sum-of-squared-differences (SSD)-based feature trackers have enjoyed growing popularity in recent years, particularly in the field of visual servo control of robotic manipulators. These trackers use SSD correlation measures to locate target features in sequences of images. The results can then be used to estimate the motion of objects in the scene, to infer the 3D structure of the scene, or to control robot motions. The reliability of the information provided by these trackers can be degraded by a variety of factors, including changes in illumination, poor image contrast, occlusion of features, and unmodeled changes in objects. This has led other researchers to develop confidence measures that are used to accept or reject individual features located by the tracker. In this paper, we derive quantitative measures for the spatial uncertainty of the results provided by SSD-based feature trackers. Unlike previous confidence measures, which have been used only to accept or reject hypotheses, our new measure allows the uncertainty associated with a feature to be used to weight its influence on the overall tracking process. Specifically, we scale the SSD correlation surface, fit a Gaussian distribution to this surface, and use this distribution to estimate values for a covariance matrix. We illustrate the efficacy of these measures by comparing the performance of an example object-tracking system with and without them.
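As a rough illustration of the idea described above, the sketch below computes an SSD surface over a search window, converts it into a response distribution by exponential scaling, and estimates a 2x2 spatial covariance matrix from the distribution's second moments. This is a minimal sketch assuming NumPy, grayscale images, and a search window that lies fully inside the image; the function names (`ssd_surface`, `covariance_from_ssd`), the exponential scaling, and the default scale constant are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np


def ssd_surface(image, template, center, search_radius):
    """Compute the SSD correlation surface around a predicted feature location.

    Assumes `image` and `template` are 2D grayscale arrays and that the search
    window lies fully inside the image; names and window layout are illustrative.
    """
    th, tw = template.shape
    t = template.astype(float)
    size = 2 * search_radius + 1
    surf = np.empty((size, size))
    for i in range(size):
        for j in range(size):
            # Top-left corner of the candidate patch for this displacement.
            r = center[0] - search_radius + i - th // 2
            c = center[1] - search_radius + j - tw // 2
            patch = image[r:r + th, c:c + tw].astype(float)
            surf[i, j] = np.sum((patch - t) ** 2)
    return surf


def covariance_from_ssd(surf, scale=None):
    """Scale the SSD surface into a response distribution and estimate a
    2x2 covariance matrix from its second moments.

    The exponential scaling and moment-based fit are one plausible reading of
    "scale the surface and fit a Gaussian"; the default scale constant below
    is a hypothetical choice, not the value used in the paper.
    """
    if scale is None:
        scale = np.std(surf) + 1e-12               # avoid division by zero on flat surfaces
    resp = np.exp(-(surf - surf.min()) / scale)     # low SSD -> high response
    resp /= resp.sum()                              # normalize to a probability distribution

    rows, cols = np.indices(surf.shape)
    mean = np.array([np.sum(rows * resp), np.sum(cols * resp)])   # weighted centroid (row, col)

    dr, dc = rows - mean[0], cols - mean[1]
    cov = np.array([[np.sum(dr * dr * resp), np.sum(dr * dc * resp)],
                    [np.sum(dr * dc * resp), np.sum(dc * dc * resp)]])
    return mean, cov
```

A sharply peaked SSD surface yields a small covariance, while a flat or multi-modal surface yields a large one; the resulting matrix could then serve, for example, as the measurement-noise covariance in a Kalman-filter update, so that ambiguous features carry less weight in the overall tracking process.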
