Multi-Sensor Depth Fusion Framework for Real-Time 3D Reconstruction

For autonomous robots, 3D perception of the environment is essential for reliable navigation in obstacle-rich environments. Because this level of understanding demands substantial computational resources, real-time 3D reconstruction of the surrounding environment has attracted considerable research attention in recent years. Outdoor 3D models are typically built from stereo cameras and laser ranging sensors. The data collected by laser ranging sensors is relatively accurate but sparse, whereas stereo cameras provide dense depth data over a limited range. In this paper, we propose a novel mechanism for incrementally fusing this sparse laser data with the dense but range-limited stereo data to produce accurate, dense depth maps in real time on a resource-limited mobile computing device. Evaluation of the proposed method shows that it outperforms state-of-the-art reconstruction frameworks that utilize depth information from only a single source.
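
The abstract describes the sparse-to-dense fusion only at a high level. The following minimal sketch (not the paper's actual method) illustrates the general idea of merging a sparse but accurate LiDAR depth map into a dense but noisier stereo depth map via inverse-variance weighting. It assumes both sensors are extrinsically calibrated and the LiDAR returns have already been projected into the stereo camera frame; the function name and the noise parameters stereo_sigma and lidar_sigma are illustrative assumptions.

    import numpy as np

    def fuse_depth(stereo_depth, lidar_depth, stereo_sigma=0.5, lidar_sigma=0.05):
        # stereo_depth: HxW dense depth map from stereo matching (metres)
        # lidar_depth:  HxW map of projected LiDAR returns (metres), NaN where no return
        fused = stereo_depth.copy()
        has_lidar = np.isfinite(lidar_depth) & (lidar_depth > 0)

        # Inverse-variance weights: the more accurate LiDAR measurement dominates
        # wherever a return exists; the dense stereo estimate fills the remaining pixels.
        w_stereo = 1.0 / stereo_sigma ** 2
        w_lidar = 1.0 / lidar_sigma ** 2

        fused[has_lidar] = (w_stereo * stereo_depth[has_lidar]
                            + w_lidar * lidar_depth[has_lidar]) / (w_stereo + w_lidar)
        return fused

In an incremental pipeline, such a fusion step would run once per synchronized stereo/LiDAR frame pair, with the fused depth map then integrated into the volumetric model.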
