User-defined gestures for surface computing

Many surface computing prototypes have employed gestures created by system designers. Although such gestures are appropriate for early investigations, they are not necessarily reflective of user behavior. We present an approach to designing tabletop gestures that relies on eliciting gestures from non-technical users by first portraying the effect of a gesture, and then asking users to perform its cause. In all, 1,080 gestures from 20 participants were logged, analyzed, and paired with think-aloud data for 27 commands performed with one and two hands. Our findings indicate that users rarely care about the number of fingers they employ, that one hand is preferred to two, that desktop idioms strongly influence users' mental models, and that some commands elicit little gestural agreement, suggesting the need for on-screen widgets. We also present a complete user-defined gesture set, quantitative agreement scores, implications for surface technology, and a taxonomy of surface gestures. Our results will help designers create better gesture sets informed by user behavior.
