Designing user-, hand-, and handpart-aware tabletop interactions with the TouchID toolkit

Recent work in multi-touch tabletop interaction has introduced many novel techniques that let people manipulate digital content through touch. Yet most systems only detect touch blobs. This ignores the richer interactions that would be possible if we could identify (1) which part of the hand, (2) which side of the hand, and (3) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that their low-level programming model hinders developers from rapidly exploring new kinds of user- and handpart-aware interactions. We contribute the TouchID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit these characteristics of touch input. TouchID provides an easy-to-use event-driven API as well as higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts with handparts, and posture and gesture configurators for registering new hand postures and gestures for the toolkit to recognize. We illustrate TouchID's expressiveness by showing how we developed a suite of techniques that exploit knowledge of which handpart is touching the surface.

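To give a concrete sense of the event-driven, identity-aware API the abstract describes, here is a minimal Python sketch of what such an interface could look like. This is a hypothetical illustration only, not the toolkit's actual API (TouchID is a C#/.NET toolkit); the names TouchEvent, TouchDispatcher, and on_handpart_down are invented for this example.

```python
# Hypothetical sketch of an event-driven, handpart-aware touch API.
# Not the real TouchID API; all names here are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class TouchEvent:
    user: str      # which person is touching (e.g. "alice")
    hand: str      # "left" or "right"
    handpart: str  # e.g. "index_finger", "palm", "fist_side"
    x: float
    y: float


class TouchDispatcher:
    """Routes identified touch events to handlers registered per handpart."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[TouchEvent], None]]] = {}

    def on_handpart_down(self, handpart: str,
                         handler: Callable[[TouchEvent], None]) -> None:
        # Register a callback that fires when the given handpart touches down.
        self._handlers.setdefault(handpart, []).append(handler)

    def dispatch(self, event: TouchEvent) -> None:
        # Deliver an identified touch event to all matching handlers.
        for handler in self._handlers.get(event.handpart, []):
            handler(event)


# Usage: a palm touch clears that user's selection, while an index-finger
# touch selects the item underneath it.
dispatcher = TouchDispatcher()
dispatcher.on_handpart_down(
    "palm",
    lambda e: print(f"{e.user} clears selection with {e.hand} palm"))
dispatcher.on_handpart_down(
    "index_finger",
    lambda e: print(f"{e.user} selects item at ({e.x}, {e.y})"))

dispatcher.dispatch(TouchEvent("alice", "right", "index_finger", 120.0, 80.0))
```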