Lightweight RGB-D SLAM System for Search and Rescue Robots

Search and rescue robots should be autonomous, as autonomy keeps human personnel out of dangerous areas. Achieving the desired level of autonomy requires both environment mapping and reliable self-localization. In this paper we analyse the application of a fast, lightweight RGB-D Simultaneous Localization and Mapping (SLAM) system to robots involved in indoor/outdoor search and rescue missions. We demonstrate that, under some conditions, RGB-D sensors provide data reliable enough even for outdoor, real-time SLAM. Experiments are performed on a legged robot and a wheeled robot, using two representative RGB-D sensors: the Asus Xtion Pro Live and the recently introduced Microsoft Kinect v2.
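The abstract does not spell out the pipeline, but lightweight, feature-based RGB-D odometry systems of this kind typically estimate the inter-frame camera pose from matched 3-D keypoints using a closed-form least-squares rigid alignment (Umeyama/Kabsch). The sketch below is a generic illustration of that step, not code from the paper; the function name and test setup are our own.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of matched 3-D feature points from two
    RGB-D frames.  Returns rotation R (3x3) and translation t (3,)
    such that dst ~ (R @ src.T).T + t.
    """
    # Center both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # SVD of the 3x3 cross-covariance gives the optimal rotation.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

In a full SLAM front end this estimation would be wrapped in RANSAC to reject bad feature matches, and the resulting relative poses would feed a pose-graph optimizer.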
