Exploring Non-touchscreen Gestures for Smartwatches

Although smartwatches are gaining popularity among mainstream consumers, their small form factor limits the available input space. The goal of this work is to explore how non-touchscreen gestures can extend the input space of smartwatches. We conducted an elicitation study in which participants proposed gestures for 31 smartwatch tasks. From this study, we demonstrate that a consensus exists among participants on the mapping of gestures to commands, and we use this consensus to specify a user-defined gesture set. Using the gestures collected in our study, we define a taxonomy describing their mapping and physical characteristics. Lastly, we provide insights to inform the design of non-touchscreen gestures for smartwatch interaction.
