Inertial-aided KLT feature tracking for a moving camera

We propose a novel inertial-aided KLT feature tracking method that is robust to camera ego-motion. The conventional KLT uses images alone, so its working range is inherently limited to small appearance changes between frames. When large optical flow is induced by camera ego-motion, an inertial sensor attached to the camera can provide a good prediction that preserves tracking performance. We use a low-grade MEMS gyroscope to refine the initial condition of the nonlinear optimization in the KLT, which increases the likelihood that the warping parameters start inside the KLT's convergence region. For longer tracking with less drift, we adopt the affine photometric model, which effectively handles camera rolling and outdoor illumination change. The extra computational cost of this higher-order motion model is alleviated by restraining the Hessian update and by GPU acceleration. Experimental results are provided for both indoor and outdoor scenes, and GPU implementation issues are discussed.
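As a rough illustration of the gyro-aided initialization (a minimal sketch, not the authors' implementation), the code below integrates one angular-rate reading into a rotation and uses the induced infinite homography H = K R K^{-1} to predict where a feature lands in the next frame. The intrinsic matrix, frame interval, and function names are assumptions made for the example; in the actual method such a prediction would seed the warp parameters of the KLT's nonlinear optimization.

```python
import numpy as np

def rotation_from_gyro(omega, dt):
    # Integrate one gyro reading omega (rad/s, camera frame) over the
    # frame interval dt into a rotation matrix via Rodrigues' formula.
    theta = np.asarray(omega, dtype=float) * dt
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.eye(3)
    k = theta / angle
    K_hat = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K_hat + (1.0 - np.cos(angle)) * (K_hat @ K_hat)

def predict_feature(K, R, uv):
    # Pure camera rotation induces the infinite homography H = K R K^-1;
    # R is taken here to map camera-frame coordinates at time t to the
    # frame at t + dt (sign convention is an assumption of this sketch).
    H = K @ R @ np.linalg.inv(K)
    q = H @ np.array([uv[0], uv[1], 1.0])
    return q[:2] / q[2]

# Example: predict where a feature at the principal point moves under a
# 0.5 rad/s yaw over a 30 ms frame interval (hypothetical intrinsics).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = rotation_from_gyro([0.0, 0.5, 0.0], 0.03)
print(predict_feature(K, R, (320.0, 240.0)))  # shifts ~9 px along u
```

The predicted position (and, for the affine warp, the local linearization of H around each feature) gives a starting point that is far more likely to lie inside the KLT convergence basin than the previous-frame position when the camera rotates quickly.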
