Analytic fusion of visual cues in model-based camera tracking

Model-based camera tracking estimates the camera pose by tracking visual cues, i.e., points and edges of a known 3D scene model, in camera images. A central challenge in model-based camera tracking is how to use these visual cues effectively for better performance. In this paper, we carefully analyze how the visual cues depend on the tracking conditions (or environments) and, based on this analysis, propose a formula for integrating the cues cooperatively into a single framework. Through experiments with synthetic camera images for which ground-truth camera poses are available, we then demonstrate that this analytic integration outperforms both the separate use of either cue and an expedient integration of the cues in arbitrary environments.
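
To make the fusion idea concrete, the following is a minimal, hypothetical sketch of how point and edge cues could be combined into a single pose-update step: point-reprojection residuals and edge-distance residuals are stacked into one weighted least-squares problem, with per-cue weights that depend on the current tracking conditions. The weighting scheme, function names, and the damped Gauss-Newton step are illustrative assumptions for exposition, not the paper's actual formula.

```python
# Hypothetical sketch of condition-weighted fusion of point and edge cues
# for a single 6-DoF pose update (not the paper's formula).
import numpy as np

def fuse_cues(J_pts, r_pts, J_edges, r_edges, w_pts, w_edges, damping=1e-6):
    """Solve one damped Gauss-Newton step from weighted point and edge residuals.

    J_pts   : (Np, 6) Jacobian of point-reprojection residuals w.r.t. pose
    r_pts   : (Np,)   point-reprojection residuals
    J_edges : (Ne, 6) Jacobian of edge-distance residuals w.r.t. pose
    r_edges : (Ne,)   edge (normal-distance) residuals
    w_pts, w_edges : scalar cue weights, e.g. derived from texture/blur measures
    """
    # Stack both cue types into a single weighted least-squares system.
    J = np.vstack([np.sqrt(w_pts) * J_pts, np.sqrt(w_edges) * J_edges])
    r = np.concatenate([np.sqrt(w_pts) * r_pts, np.sqrt(w_edges) * r_edges])

    # Damped normal equations: (J^T J + lambda * I) dx = -J^T r
    H = J.T @ J + damping * np.eye(6)
    g = J.T @ r
    return np.linalg.solve(H, -g)  # 6-vector pose increment (e.g. an se(3) twist)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic Jacobians/residuals standing in for real point and edge measurements.
    J_pts, r_pts = rng.normal(size=(40, 6)), rng.normal(size=40)
    J_edges, r_edges = rng.normal(size=(60, 6)), rng.normal(size=60)
    # Example: favor edges when the image is blurry and point features are unreliable.
    dx = fuse_cues(J_pts, r_pts, J_edges, r_edges, w_pts=0.3, w_edges=0.7)
    print("pose update:", dx)
```

In this sketch the cue weights play the role that the proposed analysis assigns to the tracking conditions: when one cue becomes unreliable, its weight shrinks and the other cue dominates the combined update.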
