An experimental study on feature-based SLAM for multi-legged robots with RGB-D sensors

Purpose: This paper aims to evaluate four simultaneous localization and mapping (SLAM) systems in the context of localization of multi-legged walking robots equipped with compact RGB-D sensors. The paper identifies problems related to in-motion data acquisition on a legged robot and evaluates the particular building blocks and concepts applied in contemporary SLAM systems against these problems. The SLAM systems are evaluated on two independent experimental set-ups, applying a well-established methodology and performance metrics.

Design/methodology/approach: Four feature-based SLAM architectures are evaluated with respect to their suitability for localization of multi-legged walking robots. The evaluation methodology is based on the computation of the absolute trajectory error (ATE) and the relative pose error (RPE), performance metrics that are well established in the robotics community. Four sequences of RGB-D frames, acquired in two independent experiments using two different six-legged walking robots, are used in the evaluation process.

Findings: The experiments revealed that the predominant problems of legged robots as platforms for SLAM are abrupt and unpredictable sensor motions, as well as oscillations and vibrations that corrupt the images captured in motion. The tested adaptive gait allowed the evaluated SLAM systems to reconstruct correct trajectories. The bundle adjustment-based SLAM systems produced the best results, thanks to the use of a map, which makes it possible to establish a large number of constraints on the estimated trajectory.

Research limitations/implications: The evaluation was performed using indoor terrain mockups. Experiments in more natural and challenging environments are envisioned as part of future research.

Practical implications: The lack of accurate self-localization methods is considered one of the most important limitations of walking robots. Thus, the evaluation of state-of-the-art SLAM methods on legged platforms may be useful for all researchers working on walking robot autonomy and its use in applications such as search, security, agriculture and mining.

Originality/value: The main contribution lies in the integration of state-of-the-art SLAM methods on walking robots and their thorough experimental evaluation using a well-established methodology. Moreover, a SLAM system designed specifically for RGB-D sensors and real-world applications is presented in detail.
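The ATE and RPE metrics mentioned above are standard in RGB-D SLAM benchmarking; a minimal sketch of their commonly used definitions follows, assuming estimated camera poses P_i in SE(3), ground-truth poses Q_i in SE(3), a fixed frame offset Delta, and a least-squares rigid-body alignment S of the estimate to the ground truth (these symbols are introduced here for illustration and are not quoted from the paper):

\[
E_i = \left(Q_i^{-1} Q_{i+\Delta}\right)^{-1}\left(P_i^{-1} P_{i+\Delta}\right),
\qquad
\mathrm{RPE}_{\mathrm{RMSE}}(\Delta) = \left(\frac{1}{m}\sum_{i=1}^{m}\left\lVert \operatorname{trans}(E_i)\right\rVert^{2}\right)^{1/2},
\quad m = n - \Delta,
\]
\[
F_i = Q_i^{-1}\, S\, P_i,
\qquad
\mathrm{ATE}_{\mathrm{RMSE}} = \left(\frac{1}{n}\sum_{i=1}^{n}\left\lVert \operatorname{trans}(F_i)\right\rVert^{2}\right)^{1/2}.
\]

Intuitively, the RPE measures local drift accumulated over the interval Delta, whereas the ATE measures the global consistency of the whole estimated trajectory after a single rigid alignment to the ground truth.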
