Improved RGB-D vision SLAM algorithm for mobile robot

Simultaneous localization and mapping (SLAM) is one of the most challenging problems in mobile robotics. This paper proposes an improved RGB-D visual SLAM algorithm for a wheeled mobile robot, addressing the accumulated position error and heavy computational load of the traditional algorithm. To reduce the error caused by the superposition of frames, a semi-random loop closure detection method is introduced at the back end of the graph optimization process. Experiments on the FR1/room dataset show that the improved RGB-D visual SLAM algorithm saves computation time and enhances the real-time performance of the algorithm.
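The core idea of semi-random loop closure detection is to avoid matching the current frame against every past frame: only a few immediately preceding frames plus a random sample of older frames are tested. The paper does not give an implementation, so the sketch below is only illustrative; the function name, the split between "nearby" and "random" candidates, and the parameter defaults are assumptions, not the authors' code.

```python
import random

def select_loop_closure_candidates(current_idx, n_nearby=3, n_random=5, seed=None):
    """Illustrative semi-random candidate selection (not the paper's code).

    Returns indices of past frames to test for loop closure against frame
    `current_idx`: the few immediately preceding frames catch short-term
    closures, and a random sample from the older history catches large
    loops without exhaustive matching.
    """
    rng = random.Random(seed)
    # Deterministic part: the most recent frames before the current one.
    nearby = list(range(max(0, current_idx - n_nearby), current_idx))
    # Random part: sample from the remaining, older history.
    older = list(range(0, max(0, current_idx - n_nearby)))
    sampled = rng.sample(older, min(n_random, len(older)))
    return sorted(set(nearby + sampled))
```

This keeps the number of candidate matches bounded (here at most `n_nearby + n_random` per frame) regardless of trajectory length, which is where the claimed computation-time saving comes from.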
