Exploring User Defined Gestures for Ear-Based Interactions

The human ear is highly sensitive and easily accessible, making it well suited as an interface for interacting with smart earpieces or augmented glasses. However, previous work on ear-based input has mainly addressed gesture-sensing technology and researcher-designed gestures. This paper aims to deepen the understanding of gesture design. To that end, we conducted a user elicitation study in which 28 participants each designed gestures for 31 smart-device tasks, yielding 868 gestures in total. From these gestures, we compiled a taxonomy and distilled the considerations underlying the participants' designs, which also offer insight into their design rationales and preferences. Based on these results, we propose a set of user-defined gestures and report further findings. We hope this work sheds light not only on sensing technologies for ear-based input, but also on the interface design of future wearable interfaces.
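Elicitation studies of this kind commonly quantify consensus per task with an agreement rate over the proposals collected for each referent. As an illustrative sketch (the paper does not specify its measure here; this follows Vatavu and Wobbrock's widely used AR formula, with hypothetical group sizes):

```python
def agreement_rate(group_sizes):
    """Agreement rate AR(r) for one referent.

    group_sizes: sizes of the groups of identical gesture proposals
    elicited for the referent, e.g. [5, 2, 1] means 5 participants
    proposed the same gesture, 2 another, and 1 a unique one.
    AR(r) = sum_i |P_i|(|P_i| - 1) / (|P|(|P| - 1)).
    """
    total = sum(group_sizes)
    if total < 2:
        return 0.0
    return sum(g * (g - 1) for g in group_sizes) / (total * (total - 1))

# Hypothetical example: 28 participants split 20/5/3 on one task.
print(agreement_rate([20, 5, 3]))  # high consensus
print(agreement_rate([1] * 28))    # no consensus -> 0.0
```

AR ranges from 0 (every participant proposed a different gesture) to 1 (unanimous agreement), which makes per-task consensus comparable across the 31 tasks.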
