Motion-based calibration of multimodal sensor arrays

This paper formulates a new pipeline for automated extrinsic calibration of multi-sensor mobile platforms. The method can operate on any combination of cameras, navigation sensors and 3D lidars. Existing approaches to extrinsic calibration either rely on special markers and/or chequerboards, or require a precise initialisation of the calibration parameters in order to converge; both limitations prevent them from being fully automatic. The method presented in this paper removes these restrictions. By combining information extracted from both the platform's motion estimates and external observations, our approach eliminates the need for special markers as well as for manual initialisation. A further advantage is that the motion-based automatic initialisation does not require overlapping fields of view between sensors. The paper also provides a method to estimate the accuracy of the resulting calibration. We illustrate the generality of our approach and validate its performance with two contrasting datasets: the first was collected in a city with a car platform, and the second in a tree-crop farm with a Segway platform.
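
The motion-based initialisation corresponds to the classical hand-eye relation A_i X = X B_i, where A_i and B_i are synchronised incremental motions of two rigidly mounted sensors and X is the unknown extrinsic transform. The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, how such an initial estimate could be computed from per-sensor egomotion (e.g. visual odometry for a camera, scan matching for a lidar) using an axis-alignment (Kabsch/Procrustes) solution for the rotation and linear least squares for the translation. The function name, input shapes and the use of SciPy are illustrative choices, not part of the paper.

```python
# Minimal sketch of a motion-based extrinsic initialisation (hand-eye form
# A_i X = X B_i). Assumes lists of 4x4 homogeneous incremental motions for two
# sensors over the same time intervals; all names here are hypothetical.
import numpy as np
from scipy.spatial.transform import Rotation


def estimate_extrinsics(motions_a, motions_b):
    """Least-squares estimate of the 4x4 transform X with A_i @ X = X @ B_i."""
    # Rotation: the axis-angle vectors of matching motions satisfy
    # alpha_i = R_x @ beta_i, so R_x solves an orthogonal Procrustes problem.
    alphas = np.array([Rotation.from_matrix(T[:3, :3]).as_rotvec() for T in motions_a])
    betas = np.array([Rotation.from_matrix(T[:3, :3]).as_rotvec() for T in motions_b])

    H = betas.T @ alphas                      # 3x3 correlation of axis vectors
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_x = Vt.T @ D @ U.T                      # closest proper rotation (Kabsch)

    # Translation: (R_Ai - I) t_x = R_x t_Bi - t_Ai, stacked over all motions.
    rows, rhs = [], []
    for Ta, Tb in zip(motions_a, motions_b):
        rows.append(Ta[:3, :3] - np.eye(3))
        rhs.append(R_x @ Tb[:3, 3] - Ta[:3, 3])
    t_x, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x
    return X
```

Note that with near-planar motion (e.g. a car on flat ground) the rotation axes are nearly parallel, so the rotation about and the translation along that axis are weakly observable from motion alone; this is a known limitation of purely motion-based hand-eye estimates and a reason to refine the initial solution with additional observations.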
