TapSense: enhancing finger interaction on touch surfaces

We present TapSense, an enhancement to touch interaction that allows conventional surfaces to identify the type of object being used for input. This is achieved by segmenting and classifying the sounds produced by an object's impact. For example, the diverse anatomy of the human finger allows different parts to be recognized, including the tip, pad, nail, and knuckle, without having to instrument the user. This opens several new and powerful interaction opportunities for touch input, especially on mobile devices, where input is extremely constrained. Our system can also identify different sets of passive tools. We conclude with a comprehensive investigation of classification accuracy and training implications. Results show that our proof-of-concept system can support sets of four input types at around 95% accuracy. Small but useful input sets of two (e.g., discriminating pen from finger) can operate at over 99% accuracy.
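
The abstract describes the recognition pipeline only at a high level: segment the sound of each impact, then classify it. As a rough illustration of that idea only (not the authors' implementation, which is not given here), the Python sketch below detects impacts in a mono audio buffer with a simple amplitude threshold, summarizes each impact as log energies in coarse frequency bands, and labels it with a 1-nearest-neighbor classifier. The sampling rate, window length, threshold, band count, function names, and the toy labels in the demo are all assumptions introduced for illustration.

import numpy as np

SAMPLE_RATE = 44100   # assumed sampling rate (Hz)
WINDOW = 2048         # samples analyzed per impact (about 46 ms; assumed)
THRESHOLD = 0.1       # amplitude threshold for onset detection (assumed)

def segment_impacts(signal, threshold=THRESHOLD, window=WINDOW):
    # Return one fixed-length window starting at each detected impact onset.
    segments, i = [], 0
    while i < len(signal) - window:
        if abs(signal[i]) > threshold:
            segments.append(signal[i:i + window])
            i += window   # skip past this impact before searching again
        else:
            i += 1
    return segments

def extract_features(segment, n_bands=32):
    # Summarize an impact as log energy in coarse frequency bands.
    spectrum = np.abs(np.fft.rfft(segment * np.hanning(len(segment))))
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([band.sum() for band in bands]))

def classify(features, train_X, train_y):
    # 1-nearest-neighbor label; a stand-in for any trained classifier.
    distances = np.linalg.norm(train_X - features, axis=1)
    return train_y[int(np.argmin(distances))]

if __name__ == "__main__":
    # Toy demo: two classes of synthetic decaying tone bursts standing in for
    # impact sounds with different spectral content.
    rng = np.random.default_rng(0)
    def burst(freq):
        t = np.arange(WINDOW) / SAMPLE_RATE
        return np.sin(2 * np.pi * freq * t) * np.exp(-40 * t)
    train_X = np.array([extract_features(burst(f) + 0.01 * rng.standard_normal(WINDOW))
                        for f in (400, 400, 2000, 2000)])
    train_y = np.array(["pad", "pad", "nail", "nail"])
    print(classify(extract_features(burst(1900)), train_X, train_y))  # prints "nail"

The threshold segmentation and nearest-neighbor step here are placeholders that simply make the segment-then-classify structure concrete; the actual system's acoustic features and classifier are not specified in the abstract.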
