Time-of-flight sensor depth enhancement for automotive exhaust gas

Time-of-Flight (ToF) sensors are envisioned as candidates for next-generation sensing on intelligent vehicles. One problem in the automotive environment is that the sensor outputs erroneous depth values when exhaust gas is present in the scene. In this paper, we make two new contributions to the signal processing aspects of ToF sensing for automotive use. First, we characterize the sensor and model its behavior in the presence of exhaust gas so that its effects can be handled. Second, we develop a depth enhancement algorithm that rejects the influence of exhaust gas using multiple images. Experimental results demonstrate the effectiveness of the depth enhancement algorithm on both static data (including ground truth) and on-vehicle data acquired with the sensor mounted on a car.
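The abstract does not specify how the multi-image rejection works; the sketch below is only an illustration, assuming it behaves like a per-pixel temporal order-statistic (median) filter over co-registered depth frames, which suppresses measurements transiently corrupted by a gas plume. The function name reject_exhaust_gas and the convention that invalid depths are coded as 0 are assumptions for this example, not the paper's actual interface.

    import numpy as np

    def reject_exhaust_gas(depth_frames, invalid_value=0.0):
        # depth_frames: array of shape (T, H, W), temporally aligned depth
        # maps of a static (or motion-compensated) scene.
        stack = np.asarray(depth_frames, dtype=np.float64)
        # Mark invalid measurements so they do not bias the statistic.
        stack = np.where(stack == invalid_value, np.nan, stack)
        # Per-pixel median over time, ignoring invalid samples; gas-corrupted
        # frames appear as outliers and are rejected by the order statistic.
        fused = np.nanmedian(stack, axis=0)
        # Pixels invalid in every frame remain marked as invalid.
        return np.where(np.isnan(fused), invalid_value, fused)

    if __name__ == "__main__":
        # Toy example: five frames of a flat wall at 10 m, one frame corrupted
        # by a simulated exhaust-gas plume returning shorter, noisy depths.
        rng = np.random.default_rng(0)
        frames = np.full((5, 4, 4), 10.0)
        frames[2, 1:3, 1:3] = rng.uniform(2.0, 4.0, size=(2, 2))
        print(reject_exhaust_gas(frames))

A single robust statistic is only a stand-in here; the paper's actual algorithm may additionally use the sensor and exhaust-gas models described in the first contribution.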
