Tightly Coupled 3D Lidar Inertial Odometry and Mapping

Ego-motion estimation is a fundamental requirement for most mobile robotic applications. Through sensor fusion, we can compensate for the deficiencies of stand-alone sensors and provide more reliable estimates. In this paper, we introduce a tightly coupled lidar-IMU fusion method. By jointly minimizing the cost derived from lidar and IMU measurements, the lidar-IMU odometry (LIO) performs well with acceptable drift over long-term experiments, even in challenging cases where the lidar measurements can be degraded. In addition, to obtain more reliable estimates of the lidar poses, a rotation-constrained refinement algorithm (LIO-mapping) is proposed to further align the lidar poses with the global map. Experimental results demonstrate that the proposed method can estimate the poses of the sensor pair at the IMU update rate with high precision, even under fast motion or with insufficient features.
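To make the tightly coupled formulation concrete, a minimal sketch of the joint optimization is given below. This is an illustrative form rather than the paper's exact notation: the residual functions, covariance weights, and index sets are assumptions. The odometry estimates all states \(\mathcal{X}\) in a sliding window by minimizing the sum of IMU preintegration residuals and lidar feature residuals:

\[
\min_{\mathcal{X}} \;
\sum_{k \in \mathcal{B}} \big\| r_{\mathcal{I}}\big(z^{b_k}_{b_{k+1}}, \mathcal{X}\big) \big\|^{2}_{P_{k}}
\;+\;
\sum_{m \in \mathcal{L}} \big\| r_{\mathcal{L}}\big(z_{m}, \mathcal{X}\big) \big\|^{2}_{C_{m}},
\]

where \(r_{\mathcal{I}}\) penalizes disagreement with the preintegrated IMU motion between consecutive states and \(r_{\mathcal{L}}\) penalizes the distance of extracted lidar feature points to their associated planes or edges. Because both residual sets share the same state variables, the lidar and IMU measurements constrain each other directly, which is what distinguishes this tight coupling from loosely coupled filtering of separately computed pose estimates. In the same spirit, the rotation-constrained refinement can be read as a map-alignment problem with an added rotational penalty, e.g. a term of the form \(\| r_{\mathcal{R}}(q, \hat{q}) \|^{2}_{C_{R}}\) appended to the lidar-only alignment cost to keep the refined orientation consistent with the odometry estimate (again, a schematic form, not the exact algorithm).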
