Extrinsic calibration and usage of a single-point laser rangefinder and single camera

Laser range data and visual imagery are widely used together in computer vision and mobile robotics because the two sensors provide complementary information, and here we focus on the fusion of a 1-D (single-point) laser rangefinder and a camera. Finding the rigid transformation between the camera and the 1-D laser rangefinder is the necessary first step for fusing their measurements. Many algorithms have been proposed to calibrate a camera against a 2-D or 3-D laser rangefinder, but few methods exist for the 1-D case. In this paper, we propose a robust extrinsic calibration algorithm that is easy to implement and yields small calibration error. Because a 1-D laser rangefinder returns only a single range along one direction, it is difficult to build geometric constraint equations as is done for 2-D laser rangefinders, so our method does not rely on such constraint equations. Moreover, because the spot of the single-point laser rangefinders in common use is usually invisible, we recover the full calibration without observing the laser spot in the camera image. We evaluate the proposed method and demonstrate its efficiency and good behavior under noise. Finally, we use the calibration result to correct the installation error of the camera.
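
To make the calibration target concrete, the following is a minimal sketch of the standard measurement model relating the two sensors; the pinhole intrinsic matrix K, the beam direction b, and the notation (R, t) are illustrative assumptions not taken from the abstract:

\[
\mathbf{X}_L = d\,\mathbf{b}, \qquad
\mathbf{X}_C = R\,\mathbf{X}_L + \mathbf{t}, \qquad
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,\mathbf{X}_C,
\]

where d is the measured range, b is the unit direction of the laser beam in the rangefinder frame, (R, t) are the extrinsic parameters to be estimated, K is the camera intrinsic matrix, and s is the projective scale. As the abstract emphasizes, the proposed method does not require observing the projected spot (u, v) in the image; the equations above only illustrate the quantities being calibrated.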
