Analysis of natural gestures for controlling robot teams on multi-touch tabletop surfaces

Multi-touch technologies hold much promise for the command and control of mobile robot teams. To improve the learnability and usability of these interfaces, we conducted an experiment to determine the gestures people would naturally use, rather than the gestures a pre-designed system would instruct them to use. A set of 26 tasks with differing control needs was presented sequentially on a DiamondTouch table to 31 participants. We found that the task of controlling robots elicited gesture sets and design considerations not observed in previous studies, which focused largely on desktop-like applications. In this paper, we present the details of these findings, a taxonomy of the gesture set, and guidelines for designing gesture sets for robot control.
