Whack gestures: inexact and inattentive interaction with mobile devices

We introduce Whack Gestures, an inexact and inattentive interaction technique. This approach seeks to provide a simple means of interacting with devices while demanding minimal attention from the user -- in particular, without the fine motor skills or detailed visual attention required by nearly all conventional interaction techniques. For mobile devices, this could enable interaction without "getting it out," grasping, or even glancing at the device. This class of techniques is suitable for a small number of simple but common interactions that can be carried out in an extremely lightweight fashion without disrupting other activities. With Whack Gestures, users interact by striking a device with the open palm or heel of the hand. We briefly discuss the development and use of a preliminary version of this technique and show that implementations with high accuracy and a low false positive rate are feasible.
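
To make the idea concrete, the sketch below shows one plausible way a whack detector could be structured: a stream of 3-axis accelerometer samples, a spike-magnitude threshold with a short debounce period, and a requirement of two whacks within a confirmation window to keep false positives low. This is a minimal illustrative sketch, not the authors' implementation; the threshold, debounce, and window values are assumptions chosen for readability.

```python
# Minimal whack-detection sketch (illustrative; not the paper's implementation).
# Assumes a stream of 3-axis accelerometer samples in units of g.
# A "whack" is a short, high-magnitude spike; requiring two whacks within a
# confirmation window is one simple way to suppress false positives.

import math
import time

WHACK_THRESHOLD_G = 3.0      # spike magnitude treated as a whack (assumed value)
DEBOUNCE_S = 0.15            # ignore further spikes this long after a whack
DOUBLE_WHACK_WINDOW_S = 1.0  # two whacks within this window fire the gesture


class WhackDetector:
    def __init__(self):
        self._last_whack_time = None
        self._last_accept_time = -math.inf

    def update(self, ax, ay, az, timestamp=None):
        """Feed one accelerometer sample; return True when a double whack fires."""
        now = timestamp if timestamp is not None else time.monotonic()
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)

        # Ignore samples below the spike threshold or inside the debounce period.
        if magnitude < WHACK_THRESHOLD_G or now - self._last_accept_time < DEBOUNCE_S:
            return False
        self._last_accept_time = now

        # First whack: remember it and wait for a second one.
        if (self._last_whack_time is None
                or now - self._last_whack_time > DOUBLE_WHACK_WINDOW_S):
            self._last_whack_time = now
            return False

        # Second whack inside the window: fire the gesture and reset.
        self._last_whack_time = None
        return True
```

In use, each new accelerometer reading would be passed to `update()`, e.g. `detector.update(0.1, 0.2, 5.3)`, and the gesture handler would run whenever it returns True.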