Deep learning-based single-shot spatial frequency multiplexing composite fringe projection profilometry

Fringe projection profilometry (FPP) has been widely used in high-speed, dynamic, real-time three-dimensional (3D) shape measurement. Recovering high-accuracy, high-precision 3D shape information from a single fringe pattern is a long-standing goal in FPP. Traditional single-shot fringe projection methods struggle to achieve high-precision 3D measurement of isolated objects and objects with complex surfaces because of variations in surface reflectivity and spectral aliasing. To break through the physical limits of these traditional methods, we apply deep convolutional neural networks to single-shot fringe projection profilometry. By combining a physical model with a data-driven approach, we demonstrate that a trained, improved U-Net can directly perform high-precision, unambiguous phase retrieval on a single spatial-frequency-multiplexed composite fringe image while avoiding spectral aliasing. Experiments show that our method retrieves high-quality absolute 3D surfaces of objects from a single projected composite fringe image.
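To make the idea of spatial frequency multiplexing concrete, the sketch below generates a composite fringe image by summing sinusoidal fringes of several spatial frequencies into a single projector pattern. This is a minimal illustration only: the frequency set, amplitudes, and encoding are hypothetical choices, not the paper's actual composite coding scheme.

```python
import numpy as np

def composite_fringe(width=640, height=480, freqs=(1, 8, 64),
                     amplitude=0.5 / 3, offset=0.5):
    """Illustrative spatial-frequency-multiplexed composite fringe:
    several cosine fringes of different spatial frequencies summed into
    one 8-bit projector image (hypothetical parameters, not the paper's
    exact encoding)."""
    x = np.arange(width) / width  # normalized horizontal coordinate in [0, 1)
    # Sum one cosine fringe per frequency; amplitudes are chosen so the
    # composite intensity stays within [0, 1] before quantization.
    row = offset + sum(amplitude * np.cos(2 * np.pi * f * x) for f in freqs)
    pattern = np.tile(row, (height, 1))  # fringes vary only horizontally
    return np.clip(pattern * 255, 0, 255).astype(np.uint8)

pattern = composite_fringe()
print(pattern.shape, pattern.dtype)
```

In a single-shot pipeline such a composite pattern would be projected once, and the network would be trained to recover the wrapped phase of the high-frequency component plus the unambiguous fringe order from the low-frequency components, sidestepping the Fourier-domain band-pass filtering that causes spectrum aliasing in classical frequency-multiplexed FTP.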
