A new algorithm for robot localization using monocular vision and inertia/odometry sensors

Fusing vision with inertia/odometry sensors has become a popular strategy for robot localization in recent years because it remains feasible in GPS-denied environments. In this paper, a new adaptive estimation algorithm, inspired by the Slotine-Li adaptive control algorithm, is designed to fuse monocular vision and inertia/odometry measurements for estimating the robot's position. With the new method, the robot can be localized in GPS-free and map-free environments, and the localization results are theoretically proved to converge to their true values and to be robust to measurement noise. Compared with other methods, our algorithm is simple to implement and well suited to parallel processing. To achieve real-time performance, the algorithm is implemented in parallel on a GPU, so it can be easily integrated into control tasks that require real-time robot localization.
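The abstract does not state the update equations, so the sketch below only illustrates the general idea of a Slotine-Li-style adaptive fusion: an odometry-driven prediction corrected by a monocular-vision position residual, with one adaptive parameter. The class name, gains, and the scale parameter theta_hat are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

class AdaptiveFusionEstimator:
    """Illustrative adaptive estimator (hypothetical structure):
    odometry prediction corrected by monocular-vision residuals,
    with a Slotine-Li-style adaptation law for an unknown scale."""

    def __init__(self, gain=1.0, adapt_rate=0.1):
        self.p_hat = np.zeros(3)      # estimated robot position
        self.theta_hat = 1.0          # adaptive parameter (e.g. monocular scale)
        self.gain = gain              # correction gain on the vision residual
        self.adapt_rate = adapt_rate  # adaptation rate for theta_hat

    def step(self, v_odom, p_vision, dt):
        # Predict the position with the inertial/odometry velocity.
        self.p_hat = self.p_hat + v_odom * dt
        # Vision residual drives both the state correction and the
        # parameter adaptation, in the spirit of Slotine-Li adaptive laws.
        e = p_vision - self.theta_hat * self.p_hat
        self.p_hat = self.p_hat + self.gain * e * dt
        self.theta_hat += self.adapt_rate * float(e @ self.p_hat) * dt
        return self.p_hat
```

In use, v_odom would come from wheel odometry or integrated inertial measurements and p_vision from a monocular pose estimate at each frame; because each update involves only small vector operations per feature or per state, this kind of loop is straightforward to parallelize on a GPU, which is consistent with the real-time implementation the paper describes.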
