Integrating LIDAR into Stereo for Fast and Improved Disparity Computation

The fusion of stereo and laser range finders (LIDARs) has been proposed as a way to compensate for each sensor's individual deficiencies: stereo output is dense but noisy at large distances, while LIDAR is more accurate but sparse. However, stereo usually performs poorly on textureless areas and on scenes containing repetitive structures, and the subsequent fusion with LIDAR then yields a degraded estimate of the 3D structure. In this paper, we propose to integrate LIDAR data directly into the stereo algorithm to reduce false positives while increasing the density of the resulting disparity image in textureless regions. We demonstrate with extensive experiments on real data that the disparity estimation is substantially improved while the stereo computation is sped up by as much as a factor of five.
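
To illustrate the general idea of guiding stereo matching with LIDAR, the sketch below (not the authors' implementation; all names and parameters such as `margin` and `d_max` are illustrative assumptions) restricts a block-matching disparity search to a narrow band around a sparse LIDAR-derived disparity prior, falling back to the full range where no prior is available. Narrowing the search both prunes false matches on ambiguous texture and reduces the number of cost evaluations, which is one plausible mechanism for the reported speed-up.

```python
# Minimal sketch, assuming rectified grayscale images and a sparse disparity
# prior obtained by projecting LIDAR points into the left camera.
import numpy as np

def lidar_guided_block_matching(left, right, lidar_disp, d_max=64,
                                window=5, margin=4):
    """left, right: rectified grayscale images (H x W, float).
    lidar_disp:     sparse LIDAR disparity prior (H x W, 0 where no return).
    d_max:          full disparity search range used where no prior exists.
    margin:         half-width of the reduced search band around the prior.
    """
    H, W = left.shape
    half = window // 2
    disp = np.zeros((H, W), dtype=np.float32)

    for y in range(half, H - half):
        for x in range(half + d_max, W - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]

            prior = lidar_disp[y, x]
            if prior > 0:
                # LIDAR prior available: search only a narrow band around it,
                # which both rejects false matches and cuts computation.
                d_lo = max(0, int(prior) - margin)
                d_hi = min(d_max, int(prior) + margin)
            else:
                d_lo, d_hi = 0, d_max  # no prior: fall back to full range

            best_d, best_cost = 0, np.inf
            for d in range(d_lo, d_hi + 1):
                tgt = right[y - half:y + half + 1,
                            x - d - half:x - d + half + 1]
                cost = np.abs(ref - tgt).sum()  # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

In practice the sparse LIDAR returns would be interpolated or propagated over a neighborhood so that most pixels receive a prior; the fallback branch above stands in for whatever the full stereo algorithm would otherwise do.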
