Extrinsic calibration of camera and 3D laser sensor system

Robots are typically equipped with multiple complementary sensors such as cameras and laser range finders. A camera generally provides dense 2D information, while range sensors give sparse but accurate depth information in the form of a set of 3D points. To represent these different data sources in a common coordinate system, extrinsic calibration is needed. This paper presents a pipeline for the extrinsic calibration of a ZED stereo camera with a Velodyne LiDAR puck using a novel self-made 3D marker whose edges can be robustly detected in both the image and the 3D point cloud. Our approach first estimates the large sensor displacement using just a single frame. We then refine this coarse result by finding the best alignment of the edges, yielding a more accurate calibration. Finally, the ratio of 3D points correctly projected onto their corresponding image segments is used to evaluate the accuracy of the calibration.
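
The evaluation step can be pictured as a simple re-projection check. The sketch below (Python/NumPy, not the authors' implementation) assumes a candidate LiDAR-to-camera extrinsic [R|t], camera intrinsics K, and a binary mask marking the marker's image segment; the function name and the segment-mask representation are illustrative assumptions.

    # Minimal sketch: project LiDAR points into the camera image with a candidate
    # extrinsic [R|t] and count how many land inside their expected image segment.
    import numpy as np

    def projection_score(points_lidar, R, t, K, segment_mask):
        """Ratio of 3D points that project onto the image segment they belong to.

        points_lidar : (N, 3) LiDAR points assumed to lie on the calibration marker
        R, t         : candidate rotation (3x3) and translation (3,), LiDAR -> camera
        K            : 3x3 camera intrinsic matrix
        segment_mask : boolean image, True where the marker appears in the image
        """
        pts_cam = points_lidar @ R.T + t            # transform into the camera frame
        pts_cam = pts_cam[pts_cam[:, 2] > 0]        # keep points in front of the camera
        uvw = pts_cam @ K.T                         # pinhole projection
        uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)
        h, w = segment_mask.shape
        inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        if not np.any(inside):
            return 0.0
        hits = segment_mask[uv[inside, 1], uv[inside, 0]]
        return float(hits.mean())

One could then compare candidate extrinsics produced by the edge-alignment refinement and keep the one with the highest score, which is the spirit of the evaluation criterion described above.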
