Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects

Touché proposes a novel Swept Frequency Capacitive Sensing technique that can not only detect a touch event, but also recognize complex configurations of the human hand and body. Such contextual information significantly enhances touch interaction in a broad range of applications, from conventional touchscreens to unusual contexts and materials; for example, in our explorations we add touch and gesture sensitivity to the human body and to liquids. We demonstrate the rich capabilities of Touché with five example setups from different application domains, and our experimental studies show that gesture classification accuracies of 99% are achievable with our technology.
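
The sensing approach sketched in the abstract (sweep a range of excitation frequencies, capture the resulting capacitive response profile, and classify that profile to recognize a hand or body configuration) can be illustrated with a minimal, hedged example. The sketch below is not the authors' implementation: it fabricates synthetic swept-frequency profiles and trains a generic SVM classifier on them, and the names N_FREQS, GESTURES, and synthetic_profile are hypothetical placeholders for a real sensor pipeline.

```python
# Minimal sketch: classifying swept-frequency capacitive profiles.
# All data here is synthetic; a real system would feed measured
# per-frequency responses into the classifier instead.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_FREQS = 200  # number of excitation frequencies in the sweep (assumed)
GESTURES = ["no_touch", "one_finger", "two_fingers", "pinch", "grasp"]

def synthetic_profile(gesture_id, rng):
    """Fake swept-frequency response: each gesture shifts the peak slightly."""
    freqs = np.linspace(0.0, 1.0, N_FREQS)
    peak = 0.2 + 0.15 * gesture_id
    profile = np.exp(-((freqs - peak) ** 2) / 0.01)
    return profile + rng.normal(scale=0.05, size=N_FREQS)

rng = np.random.default_rng(0)
X = np.array([synthetic_profile(g, rng)
              for g in range(len(GESTURES)) for _ in range(40)])
y = np.array([g for g in range(len(GESTURES)) for _ in range(40)])

# Train a generic RBF-kernel SVM on the profiles and report held-out accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2%}")
```

In this toy setup the whole swept-frequency profile is used directly as the feature vector; a practical system might instead extract summary features (peak location, amplitude, derivatives) before classification.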
