Using optical flow for filling the gaps in visual-inertial tracking

Egomotion tracking has been a frequently addressed problem over the last decades, and hybrid approaches have demonstrated potential to provide accurate, efficient and robust results. Simultaneous localisation and mapping (SLAM), in contrast to model-based approaches, enables tracking in unknown environments, but it suffers from high computational complexity. Moreover, in many applications the map itself is not needed and the target environment is partially known, e.g. through a few 3D anchor points. In this paper, rather than using SLAM, optical flow measurements are introduced into a model-based system. With these measurements, a modified visual-inertial tracking method is derived which, in Monte Carlo simulations, reduces the need for 3D points and thus allows tracking to continue during extended gaps in 3D point registrations.
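
To illustrate the general idea of feeding optical flow into a filter-based visual-inertial tracker, the following is a minimal sketch, not the paper's actual derivation. It assumes an EKF-style velocity state, normalized image coordinates, a roughly known inverse depth per feature, and the standard continuous motion-field model; the function names, state layout and noise values are illustrative assumptions.

```python
# Minimal sketch (not the paper's filter): using optical-flow measurements to
# constrain camera motion when no 3D anchor points are currently visible.
# Assumptions: normalized image coordinates, rough inverse depth per feature,
# and a 6-D state [vx, vy, vz, wx, wy, wz] of linear and angular velocity.
import numpy as np

def flow_jacobian(x, y, inv_depth):
    """Continuous motion-field model (one common sign convention):
    flow = H @ [vx, vy, vz, wx, wy, wz]^T, i.e. linear in the velocities."""
    return np.array([
        [-inv_depth, 0.0, x * inv_depth, x * y,       -(1.0 + x**2),  y],
        [0.0, -inv_depth, y * inv_depth, 1.0 + y**2,  -x * y,        -x],
    ])

def flow_update(state, cov, flow_meas, x, y, inv_depth, meas_var=1e-4):
    """One Kalman measurement update of the velocity state from the
    measured optical flow of a single tracked feature."""
    H = flow_jacobian(x, y, inv_depth)
    R = meas_var * np.eye(2)                 # flow measurement noise
    S = H @ cov @ H.T + R                    # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)         # Kalman gain
    state = state + K @ (flow_meas - H @ state)
    cov = (np.eye(len(state)) - K @ H) @ cov
    return state, cov

# Usage: between 3D point registrations, inertial integration predicts the
# state, and each feature's flow vector tightens it again.
state = np.zeros(6)
cov = np.eye(6)
state, cov = flow_update(state, cov, np.array([0.01, -0.02]),
                         x=0.1, y=-0.2, inv_depth=0.5)
```

Because the motion-field model is linear in the velocities for a given feature location and depth, a plain Kalman update suffices here; in a full system the same measurement would enter the nonlinear pose filter alongside the inertial prediction and any available 3D point observations.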
