Laser-Radar Data Fusion with Gaussian Process Implicit Surfaces

This work considers the problem of building high-fidelity 3D representations of the environment from sensor data acquired by mobile robots. Multi-sensor data fusion allows for more complete and accurate representations, and for more reliable perception, especially when different sensing modalities are used. In this paper, we present a thorough experimental analysis of the performance of 3D surface reconstruction from laser and mm-wave radar data using Gaussian Process Implicit Surfaces (GPIS), in a realistic field robotics scenario. We first analyse the performance of GPIS using raw laser data alone and raw radar data alone, with different choices of covariance function and different resolutions of the input data. We then evaluate and compare two GPIS fusion approaches. The first, which represents the state of the art, directly fuses the raw data from the laser and the radar. The alternative approach proposed in this paper first computes an initial estimate of the surface from each individual source of data, and then fuses these two estimates. We show that the latter method outperforms the state of the art, especially in situations where the sensors react differently to the targets they perceive.
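The structural difference between the two fusion strategies can be illustrated with a short sketch. The code below is a minimal illustration, not the implementation evaluated in the paper: it assumes a scikit-learn Gaussian process with an RBF kernel for the implicit-surface function, and uses an inverse-variance weighted combination as one possible rule for fusing the two per-sensor surface estimates; the covariance functions, sensor noise models, and fusion rule used in the actual experiments may differ.

```python
# Illustrative sketch only (assumed kernel and fusion rule, not the paper's exact method).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_gpis(points, f_values, noise=1e-2):
    """Fit a GP to implicit-surface observations: f(x)=0 on the surface,
    with optional off-surface points carrying positive/negative targets."""
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=noise)
    gp = GaussianProcessRegressor(kernel=kernel)
    gp.fit(points, f_values)
    return gp

def fuse_raw(laser_pts, laser_f, radar_pts, radar_f):
    """Approach 1 (state of the art): stack raw laser and radar
    observations and train a single GPIS on the combined set."""
    X = np.vstack([laser_pts, radar_pts])
    y = np.concatenate([laser_f, radar_f])
    return fit_gpis(X, y)

def fuse_estimates(gp_laser, gp_radar, query_pts, eps=1e-9):
    """Approach 2 (proposed): fit one GPIS per sensor, then fuse the two
    surface estimates at query points by inverse-variance weighting
    (one possible fusion rule, assumed here for illustration)."""
    mu_l, std_l = gp_laser.predict(query_pts, return_std=True)
    mu_r, std_r = gp_radar.predict(query_pts, return_std=True)
    w_l = 1.0 / (std_l**2 + eps)
    w_r = 1.0 / (std_r**2 + eps)
    mu = (w_l * mu_l + w_r * mu_r) / (w_l + w_r)
    var = 1.0 / (w_l + w_r)
    return mu, var
```

The key difference is where the fusion happens: in the first approach both sensors share a single GP (and hence a single set of hyperparameters and a single noise model), whereas in the second each sensor is modelled by its own GP and the posteriors are only combined at query time, which allows the two sensors to be weighted according to their respective uncertainties.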
