A Study on Low-Drift State Estimation for Humanoid Locomotion Using LiDAR and Kinematic-Inertial Data Fusion

Humanoid robots will be required to navigate unsafe and unstructured environments, such as disaster sites, to provide human assistance and support. To do so, a humanoid must construct accurate maps of the environment in real time and localize within them by estimating its base/pelvis state without drift, using computationally efficient mapping and state estimation algorithms. While a multitude of Simultaneous Localization and Mapping (SLAM) algorithms exist, their localization relies on the presence of repeatable landmarks, which may not be available in unstructured environments. Other studies use stop-and-map procedures to map the environment before traversal, but this is not ideal for scenarios where the robot must keep moving continuously, for instance to keep task completion time short. In this paper, we present a novel combination of state-of-the-art LiDAR-based odometry and mapping with state estimation based on the kinematic-inertial data of the humanoid. We present an experimental evaluation of the introduced state estimation on the full-size humanoid robot WALK-MAN while it performs locomotion tasks. Through this combination, we show that it is possible to obtain low-error, high-frequency estimates of the robot's state while moving and mapping the environment on the go.
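As a minimal illustration of the fusion idea described above (not the paper's actual implementation), the sketch below shows one common way to combine a high-frequency but drifting kinematic-inertial pose stream with low-frequency, low-drift LiDAR registration poses: each time a LiDAR pose arrives, a correction transform is computed that maps the kinematic estimate into the map frame, and this correction is then applied to all subsequent high-rate kinematic poses until the next LiDAR update. Planar (x, y, yaw) poses and the class name `DriftCorrector` are simplifying assumptions for this sketch.

```python
import math

def compose(a, b):
    """Compose two planar poses (x, y, yaw): result = a applied after b in a's frame."""
    x, y, th = a
    bx, by, bth = b
    return (x + math.cos(th) * bx - math.sin(th) * by,
            y + math.sin(th) * bx + math.cos(th) * by,
            th + bth)

def invert(p):
    """Inverse of a planar pose, so that compose(invert(p), p) is the identity."""
    x, y, th = p
    c, s = math.cos(th), math.sin(th)
    return (-(c * x + s * y), s * x - c * y, -th)

class DriftCorrector:
    """Applies the latest LiDAR-derived correction to high-rate kinematic-inertial poses.

    The LiDAR odometry runs slowly but with low drift; the kinematic-inertial
    estimator runs fast but accumulates drift. The correction re-anchors the
    fast stream to the map frame at every LiDAR update (a hypothetical scheme,
    not the method evaluated in the paper).
    """
    def __init__(self):
        self.correction = (0.0, 0.0, 0.0)  # identity until the first LiDAR pose

    def on_lidar_pose(self, lidar_pose, kin_pose_at_same_time):
        # correction maps the drifting kinematic frame into the LiDAR map frame
        self.correction = compose(lidar_pose, invert(kin_pose_at_same_time))

    def corrected(self, kin_pose):
        # high-rate output: kinematic pose re-expressed in the map frame
        return compose(self.correction, kin_pose)
```

For example, if the kinematic estimator has drifted 0.1 m laterally by the time a LiDAR pose arrives, the corrected output snaps back onto the LiDAR pose, and later kinematic poses carry only the drift accumulated since that anchor. A full system would replace this hard reset with a probabilistic update (e.g., an EKF measurement step), but the frame bookkeeping is the same.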
