User-defined gestures for connecting mobile phones, public displays, and tabletops

Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate whether gesturing with a mobile phone can help users perform complex tasks involving two devices. We present results from a user study in which we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone-to-public-display. We report the kinds of gestures we observed, along with feedback from the participants, and provide an initial assessment of which phone sensors might facilitate gesture recognition. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.
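To make the sensor assessment concrete, here is a minimal, hypothetical sketch of how a phone's accelerometer could support recognition of a simple "flick" or "throw" gesture of the kind elicited in such studies. The threshold values, the function names (`magnitude`, `detect_flicks`), and the re-arming heuristic are illustrative assumptions, not the authors' method or any particular phone API.

```python
import math

GRAVITY = 9.81          # m/s^2, approximate magnitude when the phone is at rest
FLICK_THRESHOLD = 15.0  # m/s^2, peak magnitude treated as a flick (assumed value)

def magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer sample (x, y, z)."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_flicks(samples):
    """Return indices where the acceleration magnitude crosses the flick
    threshold from below, i.e. candidate flick/throw gestures."""
    flicks = []
    above = False
    for i, s in enumerate(samples):
        m = magnitude(s)
        if m > FLICK_THRESHOLD and not above:
            flicks.append(i)
            above = True
        elif m < GRAVITY * 1.1:
            above = False  # re-arm once the phone settles back near rest

    return flicks

# Synthetic trace: at rest, one sharp flick, then at rest again.
trace = [(0.1, 0.2, 9.8)] * 5 + [(12.0, 9.0, 14.0)] + [(0.0, 0.1, 9.8)] * 5
print(detect_flicks(trace))  # -> [5]
```

A real recognizer would need per-user calibration and smoothing, and could combine the accelerometer with other sensors (e.g., gyroscope or proximity) to distinguish the gesture classes observed in the study; this sketch only shows the basic thresholding idea.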
