Gestures for Smart Rings: Empirical Results, Insights, and Design Implications

We present empirical results about users' gesture preferences for smart rings by analyzing 672 gestures from 24 participants. We report an overall low consensus (mean .112, maximum .225 on the unit scale) among participants' gesture proposals, and we highlight the challenges of designing highly generalizable ring gestures across users. We also contribute to the practice of gesture elicitation studies by discussing how a priori conditions (e.g., participants' traits, such as creativity and motor skills), their commitment and behavior during the experiment (e.g., thinking times), and a posteriori aspects (e.g., the experimenter's choice of criteria for grouping gestures into categories) affect agreement. We offer design guidelines for ring gestures informed by our empirical observations, and present a collection of gestures reflective of our participants' mental models for effecting commands with smart rings.
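
For context on the magnitude of these scores, here is a minimal worked example of the agreement-rate measure they presumably follow (the AR formula of Vatavu and Wobbrock, CHI 2015). For a referent r with the multiset P of elicited proposals, where identical proposals form groups P_i:

    AR(r) = \sum_{P_i \subseteq P} \frac{|P_i| \cdot (|P_i| - 1)}{|P| \cdot (|P| - 1)}

With |P| = 24 participants, a hypothetical split into groups of sizes 5, 5, 4, 3, 2, 2, 1, 1, 1 (illustrative only, not the study's actual data) yields AR = (20 + 20 + 12 + 6 + 2 + 2) / (24 · 23) = 62/552 ≈ .112, matching the reported mean; unanimous agreement would give AR = 1, and 24 distinct proposals would give AR = 0.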
