Evaluating a User-Elicited Gesture Set for Interactive Displays

Recently, many studies have focused on eliciting gestures from users in order to derive gesture sets for surface computing. However, many open questions remain concerning the value of this method for the usability of such gesture sets in real systems. In this work, we contribute a usability test of an implemented gesture set based on user-suggested pen and hand gestures for node-link diagram editing on interactive displays. The results of the usability test provide valuable insight into how users interact spontaneously with such a gestural interface. In particular, we found that eliciting gestures from users reveals which kinds of gestures users prefer, but it does not necessarily show how those gestures are applied. Beyond that, we observed how participants differentiate between touch and pen within complex workflows.
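The abstract's observation that participants differentiate between touch and pen suggests an interface that routes the two input modalities to different roles. As a minimal illustrative sketch (not taken from the paper), the standard Pointer Events API exposes a pointerType field that lets a diagram editor dispatch pen input to precise drawing actions and touch input to coarse manipulation; the element id and handler names below are hypothetical.

```typescript
// Hypothetical sketch: separating pen and touch on a diagram canvas
// via the standard Pointer Events API (pointerType is "pen" | "touch" | "mouse").
const canvas = document.getElementById("diagram") as HTMLCanvasElement;

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType === "pen") {
    // Pen: precise work, e.g. sketching a new node or edge.
    startStroke(e.offsetX, e.offsetY);
  } else if (e.pointerType === "touch") {
    // Touch: coarse manipulation, e.g. dragging or pinch-zooming nodes.
    startManipulation(e.pointerId, e.offsetX, e.offsetY);
  }
});

// Placeholder handlers; a real editor would implement these.
function startStroke(x: number, y: number): void {
  console.log(`pen stroke started at (${x}, ${y})`);
}
function startManipulation(id: number, x: number, y: number): void {
  console.log(`touch manipulation ${id} started at (${x}, ${y})`);
}
```

Such a division of labor (pen writes, touch manipulates) echoes the bimanual-interaction literature the paper builds on, but the concrete mapping here is only an assumption for illustration.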
