Metric for the Fusion of Synthetic and Real Imagery from Multimodal Sensors

We describe a method to improve night vision for vehicle navigation that combines images from a thermal infrared camera with images drawn from a public database of stored imagery. Such an approach allows a night scene to appear as if it were daytime, thereby increasing safety in automotive applications. We present a new metric to evaluate the fusion quality of such an augmented reality system and compare leading fusion algorithms to determine the efficacy of our approach.
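
The abstract does not give the formula of the proposed fusion metric, so the sketch below is only an illustrative stand-in: a widely used mutual-information fusion score computed between a thermal frame, a stored reference image, and the fused result. The array shapes, the naive averaging fusion, and all function names are assumptions for demonstration, not the method evaluated in the paper.

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Mutual information (in bits) between two equally sized grayscale images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()               # joint distribution
    px = pxy.sum(axis=1, keepdims=True)     # marginal of the first image
    py = pxy.sum(axis=0, keepdims=True)     # marginal of the second image
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def fusion_mi_score(thermal, reference, fused, bins=64):
    """Sum of the mutual information each source image shares with the fused image."""
    return (mutual_information(thermal, fused, bins) +
            mutual_information(reference, fused, bins))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    thermal = rng.random((240, 320))         # stand-in for a thermal IR frame
    reference = rng.random((240, 320))       # stand-in for a stored daytime image
    fused = 0.5 * thermal + 0.5 * reference  # naive averaging fusion as a baseline
    print(f"MI-based fusion score: {fusion_mi_score(thermal, reference, fused):.3f}")
```

A higher score indicates that the fused image preserves more information from both sources; in practice such a baseline score would be compared against those of the leading fusion algorithms mentioned in the abstract.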
