Understanding users' preferences for surface gestures

We compare two gesture sets for interactive surfaces---a set of gestures created by an end-user elicitation method and a set of gestures authored by three HCI researchers. Twenty-two participants who were blind to the gestures' authorship evaluated 81 gestures presented and performed on a Microsoft Surface. Our findings indicate that participants preferred gestures authored by larger groups of people, such as those created by end-user elicitation methodologies or those proposed by more than one researcher. This preference pattern seems to arise in part because the HCI researchers proposed more physically and conceptually complex gestures than end-users. We discuss our findings in detail, including the implications for surface gesture design.