Keep the Phone in Your Pocket

Previous studies have shown that visually impaired users face a unique set of pain points in smartphone interaction, including locating and removing the phone from a pocket, interacting with two hands while holding a cane, and keeping personal data private in public settings. In this paper, we present a ring-based input technique that enables in-pocket smartphone operation. Wearing a ring with an inertial measurement unit (IMU) on the index finger, users perform subtle, one-handed gestures on any surface (e.g., tables, thighs) and receive auditory feedback via earphones. We conducted participatory studies to derive a set of versatile commands and corresponding gestures. We then trained an SVM model to recognize these gestures, achieving a mean accuracy of 95.5% across 15 gesture classes. Evaluation results showed that our ring interaction is more efficient than baseline phone interactions and is easy, private, and fun to use.
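
To make the recognition pipeline concrete, the sketch below shows one plausible way to train an SVM on windowed IMU gesture segments. It is a minimal illustration, not the paper's actual pipeline: the per-axis statistical features, the synthetic data, and the SVM hyperparameters are assumptions for demonstration only.

```python
# Minimal sketch: classify windowed IMU gesture segments with an SVM.
# Feature choices and hyperparameters are illustrative, not the paper's exact method.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one gesture window of shape (samples, 6), i.e. 3-axis
    accelerometer + 3-axis gyroscope, with simple per-axis statistics."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])


# Placeholder data: 300 synthetic gesture windows over 15 gesture classes
# (the real system would use segmented IMU recordings from the ring).
rng = np.random.default_rng(0)
windows = rng.normal(size=(300, 100, 6))   # (gestures, samples per window, IMU channels)
labels = rng.integers(0, 15, size=300)     # 15 gesture classes, matching the paper

X = np.vstack([extract_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))

# Cross-validated accuracy; the paper reports a mean of 95.5% on real gesture data.
print(cross_val_score(clf, X, labels, cv=5).mean())
```

On real recordings, the feature set and window segmentation would matter far more than the classifier choice; the point here is only to show how 15-class gesture recognition from ring IMU data can be framed as a standard supervised-learning problem.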
