DIRECT: Making Touch Tracking on Ordinary Surfaces Practical with Hybrid Depth-Infrared Sensing

Several generations of inexpensive depth cameras have opened the possibility for new kinds of interaction on everyday surfaces. A number of research systems have demonstrated that depth cameras, combined with projectors for output, can turn nearly any reasonably flat surface into a touch-sensitive display. However, even with the latest generation of depth cameras, it has been difficult to obtain sufficient sensing fidelity across a table-sized surface to get much beyond a proof-of-concept demonstration. In this paper we present DIRECT, a novel touch-tracking algorithm that merges depth and infrared imagery captured by a commodity sensor. This yields significantly better touch tracking than depth data alone, and better than any prior system. Further extending prior work, DIRECT supports arbitrary user orientation and requires no prior calibration or background capture. We describe the implementation of our system and quantify its accuracy through a comparison study against previously published, depth-based touch-tracking algorithms. Results show that our technique boosts touch detection accuracy by 15% and reduces positional error by 55% compared to the next best-performing technique.
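The abstract describes the approach only at a high level. As a rough, hypothetical illustration of what hybrid depth-infrared touch sensing can look like (this is not the DIRECT algorithm itself), the sketch below assumes pre-registered millimeter depth and 8-bit infrared frames from a commodity sensor plus a known background depth map of the flat surface; the function name, thresholds, and processing steps are all assumptions made for illustration.

```python
# Illustrative sketch only: a naive depth+IR touch detector, not the DIRECT
# algorithm. Assumes pre-registered frames: depth_mm and background_mm are
# 16-bit depth images in millimeters, ir is an 8-bit grayscale IR image.
import numpy as np
import cv2

def detect_touches(depth_mm, ir, background_mm,
                   near_mm=3, far_mm=25,
                   ir_edge_thresh=(50, 150), min_area_px=40):
    """Return (x, y) pixel centroids of candidate fingertip touches."""
    # 1. Depth channel: keep pixels slightly above the background surface,
    #    i.e. likely fingers hovering at or near touch height.
    height = background_mm.astype(np.int32) - depth_mm.astype(np.int32)
    candidate = ((height > near_mm) & (height < far_mm)).astype(np.uint8) * 255

    # 2. IR channel: edges recover fingertip boundaries that depth noise
    #    blurs; cut candidate blobs along those edges.
    edges = cv2.Canny(ir, *ir_edge_thresh)
    candidate[edges > 0] = 0

    # 3. Connected components -> filter tiny blobs, report centroids.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(candidate)
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area_px]
```

The point of combining the two channels in a sketch like this is that depth alone localizes fingers near the surface only coarsely, while IR provides sharper boundary detail; any real deployment would need per-sensor tuning of the height band and edge thresholds.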
