3D LiDAR-GPS/IMU Calibration Based on Hand-Eye Calibration Model for Unmanned Vehicle

For unmanned vehicles, a multi-line LiDAR (Light Detection and Ranging) and a GPS/IMU are often used together for SLAM or for producing high-precision maps. To improve the accuracy of both navigation and map building, the extrinsic parameters between the LiDAR and the GPS/IMU must be calibrated. To address the problem of insufficient constraints in LiDAR-GPS/IMU calibration for unmanned vehicles, this paper proposes a decoupling method that groups the extrinsic parameters and calibrates each group separately. Finally, the point clouds are merged using the poses obtained from the GPS/IMU, both before and after calibration, to reconstruct the environment; comparing the two reconstruction results verifies that the proposed method is effective.
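The abstract does not spell out the solver, but the hand-eye model it refers to is the classical AX = XB formulation: A is a relative motion of the GPS/IMU, B the corresponding relative motion of the LiDAR, and X the unknown extrinsic transform. As a hedged illustration (not the paper's actual algorithm), the rotation part can be recovered from the axis-angle relation alpha_i = R_X beta_i by solving a Wahba/Kabsch alignment; the function name and the simulated data below are hypothetical.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_hand_eye_rotation(R_As, R_Bs):
    """Estimate R_X from relative-rotation pairs satisfying R_A R_X = R_X R_B.

    Uses the rotation-axis relation alpha_i = R_X beta_i (alpha, beta are the
    axis-angle vectors of A and B) and solves the resulting Wahba problem by SVD.
    """
    M = np.zeros((3, 3))
    for R_A, R_B in zip(R_As, R_Bs):
        alpha = Rotation.from_matrix(R_A).as_rotvec()  # axis-angle of IMU motion
        beta = Rotation.from_matrix(R_B).as_rotvec()   # axis-angle of LiDAR motion
        M += np.outer(alpha, beta)
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # enforce det(R_X) = +1
    return U @ D @ Vt

# Simulated check: recover a known extrinsic rotation from five noise-free motions.
rots = Rotation.random(6, random_state=0).as_matrix()
R_X_true = rots[0]
R_Bs = list(rots[1:])
R_As = [R_X_true @ R_B @ R_X_true.T for R_B in R_Bs]  # A = X B X^-1 (rotation part)
R_X_est = solve_hand_eye_rotation(R_As, R_Bs)
print(np.allclose(R_X_est, R_X_true, atol=1e-6))
```

In practice the translation part is solved afterwards from the linear system (R_A - I) t_X = R_X t_B - t_A, which is exactly where the paper's point about insufficient constraints arises: planar vehicle motion leaves some components of the extrinsics weakly observable, motivating the grouped, decoupled calibration.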
