Guidance-image-based method for real-time motion-artefact handling on Time-of-Flight cameras

This paper presents a real-time approach to address the motion artefact inherent to the Time-of-Flight (ToF) camera's working principle. ToF cameras based on demodulation lock-in pixels estimate depth from the phase shift between the emitted and received modulated near-infrared (NIR) signals, for which four sequential phase-shifted images are required, i.e., the four-tap technique. The ToF working principle assumes the scene to be motionless during this acquisition interval. In practice, however, unreliable depth measurements arise along object boundaries in dynamic scenes, especially when fast movements are involved. Herein, we propose a robust method to identify those pixels in the resulting depth map that are prone to be unreliable. We then replace their values with the closest reliable ones using the guided filter (GF) and an accurate guidance image generated from the previously acquired sequential phase-shifted images. The GF was selected because it behaves better near edges than alternative edge-preserving filters, with the major advantage of being a fast, non-approximate, linear-time algorithm. The experimental evaluation shows that the proposed method satisfactorily addresses the motion artefact, even in extreme conditions.
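The pipeline the abstract outlines can be sketched in a few steps: compute depth from the four phase-shifted images, flag pixels whose taps violate the static-scene assumption, and repair the flagged depths with a guided filter driven by a guidance image derived from the same taps. The sketch below is a minimal, simplified illustration, not the paper's exact method: the sign convention in the phase formula, the consistency test, the threshold `tau`, the use of the recovered amplitude as the guidance image, and the hole-filling strategy are all assumptions chosen for clarity.

```python
import numpy as np
from scipy.ndimage import uniform_filter

C = 3e8       # speed of light (m/s)
F_MOD = 20e6  # illustrative modulation frequency (Hz); device-dependent

def four_tap_depth(a0, a1, a2, a3):
    """Depth from four sequential phase images (one common sign convention)."""
    phase = np.mod(np.arctan2(a1 - a3, a0 - a2), 2 * np.pi)
    return C * phase / (4 * np.pi * F_MOD)

def motion_mask(a0, a1, a2, a3, tau=0.05):
    """Flag pixels violating the static-scene consistency of an ideal
    sinusoidal correlation (a0 + a2 == a1 + a3); tau is illustrative."""
    total = a0 + a1 + a2 + a3 + 1e-12
    return np.abs((a0 + a2) - (a1 + a3)) / total > tau

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Grayscale guided filter; box means computed with uniform_filter."""
    size = 2 * radius + 1
    m_i = uniform_filter(guide, size)
    m_p = uniform_filter(src, size)
    cov = uniform_filter(guide * src, size) - m_i * m_p
    var = uniform_filter(guide * guide, size) - m_i * m_i
    a = cov / (var + eps)
    b = m_p - a * m_i
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

# Synthetic static scene: taps sample B + A*cos(phi - i*pi/2) at a
# constant phase phi; a real sensor delivers these four raw images.
h, w, phi, amp, offs = 32, 32, 1.0, 100.0, 200.0
taps = [offs + amp * np.cos(phi - i * np.pi / 2) * np.ones((h, w))
        for i in range(4)]

depth = four_tap_depth(*taps)
mask = motion_mask(*taps)

# Guidance image: amplitude recovered from the taps (one plausible choice),
# normalised to [0, 1].
guidance = 0.5 * np.sqrt((taps[1] - taps[3]) ** 2 + (taps[0] - taps[2]) ** 2)
guidance /= guidance.max() + 1e-12

# Replace only the flagged depths by the guided-filter output of the
# (crudely hole-filled) depth map; reliable pixels pass through untouched.
filled = np.where(mask,
                  guided_filter(guidance, np.where(mask, 0.0, depth)),
                  depth)
```

For this static synthetic scene no pixel is flagged and the depth map passes through unchanged; on real dynamic data the mask concentrates along moving object boundaries, which is exactly where the guided filter propagates the nearest reliable depths.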
