Exploring the use of hand-to-face input for interacting with head-worn displays

We propose the use of Hand-to-Face input, a method to interact with head-worn displays (HWDs) that involves contact with the face. We explore Hand-to-Face interaction to find suitable techniques for common mobile tasks. We evaluate this form of interaction with document navigation tasks and examine its social acceptability. In a first study, users identify the cheek and forehead as the predominant areas for interaction and agree on gestures for tasks involving continuous input, such as document navigation. These results guide the design of several Hand-to-Face navigation techniques, and our evaluation reveals that gestures performed on the cheek are more efficient and less tiring than interactions directly on the HWD. Initial results on the social acceptability of Hand-to-Face input allow us to further refine our design choices and reveal unforeseen findings: some gestures are considered culturally inappropriate, and gender plays a role in the selection of specific Hand-to-Face interactions. From our overall results, we provide a set of guidelines for developing effective Hand-to-Face interaction techniques.
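
To make the gesture-to-navigation mapping concrete, here is a minimal sketch of how continuous cheek input could drive document navigation on an HWD. The event model, names (FaceGesture, DocumentView), regions, and gain factor are all illustrative assumptions, not an API from the paper; only the high-level mapping (cheek gestures as continuous input for navigation) reflects the study's findings.

```python
# Hypothetical sketch: routing Hand-to-Face gestures to document
# navigation on a head-worn display. All names and values below are
# assumptions for illustration, not the paper's implementation.

from dataclasses import dataclass


@dataclass
class FaceGesture:
    region: str   # e.g., "cheek" or "forehead" (areas preferred in the first study)
    kind: str     # e.g., "swipe" or "pinch"
    dx: float     # horizontal displacement, normalized to [-1, 1]
    dy: float     # vertical displacement, normalized to [-1, 1]


class DocumentView:
    """Minimal document viewport holding pan and zoom state."""

    def __init__(self) -> None:
        self.scroll_y = 0.0
        self.zoom = 1.0

    def handle(self, g: FaceGesture) -> None:
        # Continuous cheek input drives navigation, per the preference
        # for cheek gestures reported in the elicitation study.
        if g.region == "cheek" and g.kind == "swipe":
            self.scroll_y += g.dy * 100.0            # assumed scroll gain
        elif g.region == "cheek" and g.kind == "pinch":
            # Clamp zoom to an assumed usable range.
            self.zoom = max(0.25, min(4.0, self.zoom + g.dx))


view = DocumentView()
view.handle(FaceGesture(region="cheek", kind="swipe", dx=0.0, dy=0.4))
print(view.scroll_y)  # 40.0
```

A real system would additionally need a sensing layer (e.g., a camera or capacitive sensor segmenting face regions) to produce such gesture events; this sketch only illustrates the dispatch from recognized gestures to navigation actions.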
