Investigating Microinteractions for People with Visual Impairments and the Potential Role of On-Body Interaction

For screenreader users who are blind or visually impaired (VI), today's mobile devices, while reasonably accessible, are not necessarily efficient. This inefficiency may be especially problematic for microinteractions, which are brief but high-frequency interactions that take only a few seconds for sighted users to complete (e.g., checking the weather or for new messages). One potential solution to support efficient non-visual microinteractions is on-body input, which appropriates the user's own body as the interaction medium. In this paper, we address two related research questions: How well are microinteractions currently supported for VI users? How should on-body interaction be designed to best support microinteractions for this user group? We conducted two studies: (1) an online survey to compare current microinteraction use between VI and sighted users (N=117); and (2) an in-person study in which 12 VI screenreader users qualitatively evaluated a real-time on-body interaction system that provided three contrasting input designs. Our findings suggest that efficient microinteractions are not currently well supported for VI users, at least with manual input, which highlights the need for new interaction approaches. On-body input offers this potential, and the qualitative evaluation revealed tradeoffs among the on-body interaction techniques in terms of perceived efficiency, learnability, social acceptability, and ability to use on the go.
