3D interactive system based on vision computing of direct-flective cameras

A bare-finger 3D interactive technology for portable devices was developed. Direct-flective cameras are used to reshape the field of view, eliminating the blind working range close to the camera. Moreover, a vision-computing algorithm, distinct from skin-color detection, is presented to determine the positions of fingertips. The interactive range extends from 1.5 to 50 cm above the entire surface of the display, and a mean position error of less than 1 cm is achieved. This accuracy realizes a camera-based 3D interactive system with near-distance functionality. Floating 3D images can therefore be touched and interacted with, potentially enabling more applications and a more intuitive user-machine interface.
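The abstract does not give the vision-computing algorithm itself, but the core of any two-camera fingertip-positioning system is triangulation of the fingertip's 3D position from its pixel coordinates in two rectified views. The following is a minimal sketch of that generic pinhole-stereo step, not the paper's actual method; all function names, the focal length, baseline, and principal-point values are illustrative assumptions.

```python
def stereo_depth(x_left, x_right, focal_px, baseline_cm):
    """Depth from horizontal disparity between two rectified views.

    Pinhole stereo model: depth = focal_length * baseline / disparity.
    Units: pixels for image coordinates and focal length, cm for baseline,
    so the returned depth is in cm.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_cm / disparity


def fingertip_3d(x_left, y, x_right, focal_px, baseline_cm, cx, cy):
    """Back-project a matched fingertip pixel into 3D camera coordinates (cm).

    (cx, cy) is the principal point of the left camera; (x_left, y) is the
    fingertip pixel in the left view and x_right its match in the right view.
    """
    z = stereo_depth(x_left, x_right, focal_px, baseline_cm)
    x = (x_left - cx) * z / focal_px
    y3 = (y - cy) * z / focal_px
    return (x, y3, z)
```

With an assumed 500 px focal length and 2 cm baseline, a 20 px disparity places the fingertip at 50 cm depth, which matches the far end of the interactive range reported in the abstract; in a real system the intrinsic parameters would come from a calibration step such as Zhang's method.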
