Stereo vision-based localization for hexapod walking robots operating in rough terrains

This paper addresses the self-localization problem of a hexapod walking robot operating in rough terrain. Although legged robots offer higher terrain passability than wheeled or tracked platforms in harsh environments, they pose a challenge for localization techniques because the camera motion between consecutive frames can be arbitrary due to the motion gait and terrain irregularities. We present and evaluate an inertially assisted Stereo Parallel Tracking and Mapping (S-PTAM) method deployed on a hexapod walking robot in rough terrain. The considered deployment scenario is motivated by autonomous open-loop navigation in an unknown environment. The reported results and a comparison with an existing RGB-D SLAM technique show the feasibility of the proposed approach and its suitability for the navigation of walking robots in harsh environments.
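As illustrative background only (not the paper's actual pipeline), the basic geometric relation underlying stereo vision-based localization is depth recovery from disparity in a rectified stereo pair, Z = f·B/d. The following minimal sketch uses hypothetical camera parameters; the S-PTAM system evaluated in the paper builds a full tracking-and-mapping pipeline on top of such triangulated points.

```python
# Minimal sketch: depth from stereo disparity for a rectified pair.
# All parameter values below are hypothetical, for illustration only.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a feature matched across a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 0.12 m baseline, 20 px disparity
z = depth_from_disparity(700.0, 0.12, 20.0)
print(f"{z:.2f} m")  # 4.20 m
```

Note that depth resolution degrades quadratically with distance, which is one reason stereo localization on a low-slung walking robot favors nearby terrain features.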
