Effects of Offset Pixel Aperture Width on the Performances of Monochrome CMOS Image Sensors for Depth Extraction

This paper presents the effects of offset pixel aperture (OPA) width on the performance of monochrome (MONO) CMOS image sensors (CISs) for a three-dimensional image sensor. By integrating an OPA inside each pixel, depth information can be acquired from the disparity between the OPA patterns. The OPA is classified into two pattern types: the left-offset pixel aperture (LOPA) and the right-offset pixel aperture (ROPA). These OPAs are assigned to odd and even rows, respectively, and integrated into the pixel array. To analyze the correlation between the OPA width and the sensor characteristics, experiments were conducted on test element group (TEG) regions in which the OPA width was varied from 0.3 to 0.5 μm. As the aperture width decreased, the disparity of the image increased while the sensitivity decreased. Because depth information is obtained from the disparity of the proposed MONO CIS with the OPA technique, no external light source is required, so the sensor can easily be applied to miniaturized devices. The proposed MONO CIS was designed and fabricated using a 0.11 μm CIS process.
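To make the principle concrete, the sketch below shows how a per-pixel disparity, and hence a relative depth value, could be derived from the LOPA/ROPA row layout described above. It is a minimal illustration only: the odd/even row assignment follows the abstract, but the sum-of-absolute-differences block search, the block size, the search range, and the calibration constant `k` are generic placeholders and are not the processing pipeline used in the paper.

```python
import numpy as np

def split_opa_rows(raw):
    """Split a raw MONO CIS frame into LOPA and ROPA sub-images.

    Assumes LOPA pixels occupy odd rows and ROPA pixels occupy even rows,
    as stated in the abstract; the real sensor's row assignment and any
    row-interpolation step may differ.
    """
    lopa = raw[1::2, :]   # odd rows  (0-indexed rows 1, 3, 5, ...)
    ropa = raw[0::2, :]   # even rows (0-indexed rows 0, 2, 4, ...)
    return lopa, ropa

def block_disparity(lopa, ropa, y, x, block=9, max_shift=8):
    """Estimate horizontal disparity at (y, x) by 1-D block matching.

    Generic sum-of-absolute-differences search over horizontal shifts;
    not the authors' disparity-extraction method.
    """
    half = block // 2
    ref = lopa[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        cand = ropa[y - half:y + half + 1,
                    x - half + s:x + half + 1 + s].astype(np.float32)
        if cand.shape != ref.shape:       # skip shifts that fall off the image
            continue
        cost = np.abs(ref - cand).sum()
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

def depth_from_disparity(disparity_px, k=1000.0):
    """Map disparity to relative depth with one calibration constant k.

    The true disparity-to-depth relation depends on the lens and the OPA
    offset; k = 1000.0 is a placeholder value.
    """
    return np.where(disparity_px != 0, k / np.abs(disparity_px), np.inf)
```

For example, `block_disparity(*split_opa_rows(frame), y=120, x=200)` would return the best-matching horizontal shift at one pixel, which `depth_from_disparity` converts to a relative depth; a full depth map would repeat this over all valid pixels.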
