Learning Inertial Odometry for Dynamic Legged Robot State Estimation

This paper introduces a novel proprioceptive state estimator for legged robots based on a learned displacement measurement from IMU data. Recent research in pedestrian tracking has shown that motion can be inferred from inertial data using convolutional neural networks. A learned inertial displacement measurement can improve state estimation in challenging scenarios where leg odometry is unreliable, such as slippery or compressible terrain. Our method learns to estimate a displacement measurement from IMU data, which is then fused with traditional leg odometry. This approach greatly reduces the drift of proprioceptive state estimation, which is critical for legged robots deployed in vision- and lidar-denied environments such as foggy sewers or dusty mines. We compare results from an EKF and an incremental fixed-lag factor-graph estimator using data from several real robot experiments on challenging terrain. Our results show a 37% reduction in relative pose error in challenging scenarios compared to a traditional kinematic-inertial estimator without the learned measurement. We also demonstrate a 22% reduction in error when the learned measurement is fused with vision systems in visually degraded environments such as an underground mine.
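To make the idea concrete, the sketch below shows one plausible form of a learned inertial displacement measurement: a small 1D convolutional network that maps a window of IMU samples to a 3D displacement and a per-axis log-variance, in the spirit of learned inertial odometry systems such as TLIO and RoNIN. This is an illustrative assumption, not the authors' exact architecture; the class name, channel sizes, and window length are hypothetical.

```python
# Minimal sketch (assumed architecture, not the paper's exact network) of a
# CNN that regresses a displacement and its uncertainty from an IMU window.
import torch
import torch.nn as nn


class ImuDisplacementNet(nn.Module):
    """Maps a window of IMU data (accel + gyro) to a displacement estimate.

    Input:  (batch, 6, T)  -- 3-axis accelerometer + 3-axis gyroscope samples
    Output: (batch, 3) displacement, (batch, 3) per-axis log-variance
    """

    def __init__(self, channels: int = 64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv1d(6, channels, kernel_size=7, padding=3),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size=7, padding=3, stride=2),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )
        self.head = nn.Linear(channels, 6)  # 3 displacement + 3 log-variance

    def forward(self, imu_window: torch.Tensor):
        features = self.backbone(imu_window).squeeze(-1)
        out = self.head(features)
        displacement, log_var = out[:, :3], out[:, 3:]
        return displacement, log_var


if __name__ == "__main__":
    net = ImuDisplacementNet()
    window = torch.randn(8, 6, 200)  # e.g. 1 s of IMU data at 200 Hz
    disp, log_var = net(window)
    print(disp.shape, log_var.shape)  # torch.Size([8, 3]) torch.Size([8, 3])
```

In a fusion pipeline of the kind the abstract describes, the predicted log-variance could be exponentiated into a measurement covariance so that the EKF update or the corresponding factor in the fixed-lag factor graph is weighted by the network's own confidence; how the paper actually parameterizes and fuses this uncertainty is not specified here.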
