Fusion between laser and stereo vision data for moving objects tracking in intersection like scenario

Using multiple sensors for environment perception in autonomous vehicles is common practice these days. Data perceived by these sensors can be fused at different levels: before object detection, after object detection, or after the moving objects have been tracked. In this paper we detail our fusion of laser and stereo vision sensors at the object detection level, as opposed to pre-detection or track-level fusion. The output of our laser processing is a list of objects with position and dynamic properties for each object. Similarly, the stereo vision output of another team consists of a list of detected objects with position and classification properties for each object. We apply a Bayesian fusion technique to the objects of these two lists to obtain a new list of fused objects, which is then used in the tracking phase to track moving objects in an intersection-like scenario. The results obtained on data sets from the INTERSAFE-2 demonstrator vehicle show that this fusion improves the data association and track management steps.
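As a rough illustration of the object-level Bayesian fusion described above, the sketch below fuses a laser detection and a stereo detection of the same object, assuming each detection carries a Gaussian position estimate (mean and covariance). This is a minimal product-of-Gaussians example, not the paper's actual implementation; all names and numeric values are hypothetical.

```python
import numpy as np

def fuse_detections(x1, P1, x2, P2):
    """Bayesian (product-of-Gaussians) fusion of two position
    estimates of the same object.

    x1, x2 : 2-D position means, e.g. from the laser and stereo lists.
    P1, P2 : corresponding 2x2 position covariance matrices.
    Returns the fused mean and covariance.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)       # fused covariance
    x = P @ (P1_inv @ x1 + P2_inv @ x2)      # fused mean
    return x, P

# Hypothetical example: a laser detection (precise in range) and a
# stereo detection (precise laterally) associated to the same object.
laser_pos  = np.array([10.0, 2.0])
laser_cov  = np.diag([0.1, 0.5])
stereo_pos = np.array([10.4, 2.2])
stereo_cov = np.diag([0.5, 0.1])

fused_pos, fused_cov = fuse_detections(laser_pos, laser_cov,
                                       stereo_pos, stereo_cov)
```

The fused estimate lies between the two measurements, weighted toward whichever sensor is more certain along each axis, and its covariance is smaller than either input's, which is what makes the subsequent data association and track management steps easier.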
