LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping

We propose a framework for tightly-coupled lidar inertial odometry via smoothing and mapping, LIO-SAM, that achieves highly accurate, real-time mobile robot trajectory estimation and map-building. LIO-SAM formulates lidar-inertial odometry atop a factor graph, allowing a multitude of relative and absolute measurements, including loop closures, to be incorporated from different sources as factors into the system. The estimated motion from inertial measurement unit (IMU) pre-integration de-skews point clouds and produces an initial guess for lidar odometry optimization. The obtained lidar odometry solution is used to estimate the bias of the IMU. To ensure high performance in real-time, we marginalize old lidar scans for pose optimization, rather than matching lidar scans to a global map. Scan-matching at a local scale instead of a global scale significantly improves the real-time performance of the system, as does the selective introduction of keyframes, and an efficient sliding window approach that registers a new keyframe to a fixed-size set of prior "sub-keyframes." The proposed method is extensively evaluated on datasets gathered from three platforms over various scales and environments.
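To make the factor-graph formulation described above concrete, the following is a minimal sketch of a two-keyframe window using GTSAM's Python bindings: an IMU pre-integration factor and bias-random-walk factor connect consecutive keyframes, a between-pose factor stands in for the lidar odometry result obtained by scan matching against the sliding window of sub-keyframes, and iSAM2 performs incremental smoothing. This is not the authors' implementation (which is C++/ROS); the key names, noise sigmas, and sensor measurements below are placeholder assumptions chosen only for illustration.

```python
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X, V, B  # pose, velocity, and IMU-bias keys

graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()

# Priors anchoring the first keyframe's pose, velocity, and IMU bias (placeholder sigmas).
pose_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-2] * 6))
vel_noise = gtsam.noiseModel.Isotropic.Sigma(3, 1e-2)
bias_noise = gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), pose_noise))
graph.add(gtsam.PriorFactorVector(V(0), np.zeros(3), vel_noise))
graph.add(gtsam.PriorFactorConstantBias(B(0), gtsam.imuBias.ConstantBias(), bias_noise))

# IMU pre-integration between keyframes 0 and 1 (synthetic stationary measurements at 100 Hz).
params = gtsam.PreintegrationParams.MakeSharedU(9.81)  # gravity along -z
params.setAccelerometerCovariance(np.eye(3) * 1e-3)
params.setGyroscopeCovariance(np.eye(3) * 1e-4)
params.setIntegrationCovariance(np.eye(3) * 1e-4)
pim = gtsam.PreintegratedImuMeasurements(params, gtsam.imuBias.ConstantBias())
for _ in range(100):
    pim.integrateMeasurement(np.array([0.0, 0.0, 9.81]), np.zeros(3), 0.01)
graph.add(gtsam.ImuFactor(X(0), V(0), X(1), V(1), B(0), pim))
graph.add(gtsam.BetweenFactorConstantBias(
    B(0), B(1), gtsam.imuBias.ConstantBias(),
    gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)))

# Lidar odometry factor: in LIO-SAM this relative pose comes from registering the new
# keyframe to the fixed-size set of recent sub-keyframes; here it is a placeholder value.
lidar_rel = gtsam.Pose3(gtsam.Rot3(), np.array([0.5, 0.0, 0.0]))
lidar_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-2] * 3 + [1e-1] * 3))
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), lidar_rel, lidar_noise))

# Initial guesses; the IMU pre-integration prediction would normally seed X(1) and V(1).
values.insert(X(0), gtsam.Pose3())
values.insert(V(0), np.zeros(3))
values.insert(B(0), gtsam.imuBias.ConstantBias())
values.insert(X(1), gtsam.Pose3(gtsam.Rot3(), np.array([0.5, 0.0, 0.0])))
values.insert(V(1), np.zeros(3))
values.insert(B(1), gtsam.imuBias.ConstantBias())

# Incremental smoothing; loop-closure and absolute (e.g. GPS) factors are added the same way.
isam = gtsam.ISAM2()
isam.update(graph, values)
result = isam.calculateEstimate()
print(result.atPose3(X(1)))
```

Because all constraints enter as factors of a common graph, the lidar odometry residual corrects the IMU bias estimate at each update, while the optimized bias in turn improves the next pre-integration used for de-skewing and for the scan-matching initial guess.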
