WRIST: Watch-Ring Interaction and Sensing Technique for Wrist Gestures and Macro-Micro Pointing

To better explore the incorporation of pointing and gesturing into ubiquitous computing, we introduce WRIST, an interaction and sensing technique that leverages the dexterity of human wrist motion. WRIST employs a sensor-fusion approach that combines inertial measurement unit (IMU) data from a smartwatch and a smart ring. The orientation difference between the two devices is measured as wrist rotation, which is independent of arm rotation as well as position- and orientation-invariant. Using our test hardware, we demonstrate that WRIST enables a number of novel yet simple interaction techniques, such as (i) macro-micro pointing without explicit mode switching and (ii) wrist gesture recognition when the hand is held in different orientations (e.g., raised or lowered). We report on two studies that evaluate the proposed techniques and present a set of applications demonstrating the benefits of WRIST. We conclude with a discussion of the limitations and highlight possible future pathways for research in pointing and gesturing with wearable devices.
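
To illustrate the core sensing idea, the sketch below shows one way wrist rotation could be recovered as the relative orientation between the two IMUs. This is a minimal illustration, not the authors' implementation: it assumes each device exposes an absolute orientation quaternion in a shared world frame (e.g., from an on-device fusion filter), uses SciPy's rotation utilities, and the sample readings are hypothetical.

```python
# Minimal sketch (assumption, not the paper's implementation): estimate wrist
# rotation as the relative orientation between a watch IMU and a ring IMU.
# Both devices are assumed to report absolute orientation quaternions
# (x, y, z, w) in a shared world frame.
from scipy.spatial.transform import Rotation as R

def wrist_rotation(watch_quat, ring_quat):
    """Return the ring's orientation expressed in the watch's local frame.

    Since both orientations share the same world frame, the product
    R_watch^-1 * R_ring cancels the common arm/world component, leaving
    only the rotation of the hand relative to the forearm (the wrist),
    independent of where the arm points.
    """
    r_watch = R.from_quat(watch_quat)   # forearm orientation (smartwatch)
    r_ring = R.from_quat(ring_quat)     # hand/finger orientation (smart ring)
    r_rel = r_watch.inv() * r_ring      # arm-invariant wrist rotation
    return r_rel.as_euler("xyz", degrees=True)

# Hypothetical readings: the arm is rotated 30 deg about z in both devices,
# while the wrist adds a further 20 deg flexion about x on the ring side.
watch_q = R.from_euler("xyz", [0, 0, 30], degrees=True).as_quat()
ring_q = R.from_euler("xyz", [20, 0, 30], degrees=True).as_quat()
print(wrist_rotation(watch_q, ring_q))  # ~[20, 0, 0]: the arm rotation cancels
```

Because the arm component cancels in the relative term, the same wrist gesture produces the same signal whether the hand is raised or lowered, which is what makes the gesture set orientation-invariant and lets the residual arm motion drive macro pointing while the wrist term drives micro pointing.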
