Line-based extrinsic calibration of range and image sensors

Creating rich representations of environments requires the integration of multiple sensing modalities with complementary characteristics, such as range and imaging sensors. To combine multisensory information precisely, the rigid transformation between the different sensor coordinate systems (i.e., the extrinsic parameters) must be estimated. Most existing extrinsic calibration techniques require one or more planar calibration patterns (such as checkerboards) to be observed simultaneously by the range and imaging sensors. The main limitation of these approaches is that they require modifying the scene with artificial targets. In this paper, we present a novel algorithm for extrinsically calibrating a range sensor with respect to an image sensor without requiring external artificial targets. The proposed method exploits natural linear features in the scene to precisely determine the rigid transformation between the coordinate frames. First, a set of 3D lines (plane intersections and boundary line segments) is extracted from the point cloud, and a set of 2D line segments is extracted from the image. Correspondences between the 3D and 2D line segments are used as inputs to an optimization problem that jointly estimates the relative translation and rotation between the coordinate frames. The proposed method is not limited to any particular type or configuration of sensors. To demonstrate the robustness, efficiency, and generality of the presented algorithm, we include results using various sensor configurations.
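The core estimation step described above, projecting extracted 3D line endpoints into the image and minimizing their distance to the corresponding 2D image lines over the relative rotation and translation, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes known camera intrinsics `K`, already-matched 3D/2D segment endpoints, and hypothetical helper names such as `calibrate_extrinsics`.

```python
# Minimal sketch of line-based extrinsic calibration (illustrative only).
# Assumes known intrinsics K and matched 3D/2D line-segment endpoints.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def line_residuals(params, K, lines_3d, lines_2d):
    """One residual per projected 3D endpoint: its signed pixel distance
    to the corresponding 2D image line."""
    rvec, t = params[:3], params[3:]
    R = Rotation.from_rotvec(rvec).as_matrix()
    res = []
    for (P0, P1), (p0, p1) in zip(lines_3d, lines_2d):
        # Homogeneous 2D line through the detected image segment: l = p0 x p1.
        l = np.cross(np.append(p0, 1.0), np.append(p1, 1.0))
        l /= np.linalg.norm(l[:2])          # normalize so l . x is a pixel distance
        for P in (P0, P1):
            x = K @ (R @ P + t)             # project 3D endpoint into the image
            x /= x[2]
            res.append(l @ x)               # signed point-to-line distance
    return np.asarray(res)


def calibrate_extrinsics(K, lines_3d, lines_2d, x0=None):
    """Jointly estimate rotation (as a rotation vector) and translation
    from 3D-2D line correspondences by nonlinear least squares."""
    if x0 is None:
        x0 = np.zeros(6)                    # identity rotation, zero translation
    sol = least_squares(line_residuals, x0, args=(K, lines_3d, lines_2d),
                        loss="huber")       # robust loss against mismatched lines
    R = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    t = sol.x[3:]
    return R, t
```

The sketch covers only the joint pose refinement given correspondences; the extraction of 3D lines from the point cloud and of 2D segments from the image, as well as their matching, is handled by the rest of the pipeline described in the abstract.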
