Analyzing visually impaired people’s touch gestures on smartphones

We present an analysis of how visually impaired people perform gestures on touchscreen smartphones, report their preferences, and describe the procedure and technical implementation we used to collect gesture samples. To that end, we recruited 36 visually impaired participants and divided them into two groups: low-vision and blind people. We then examined their touch-gesture preferences in terms of number of strokes, multi-touch, and shape angle, as well as their execution in geometric, kinematic, and relative terms. For this purpose, we developed a wireless system that records sample gestures from several participants simultaneously and allows the capture process to be monitored. Our results are consistent with previous research on visually impaired users' preference for simple gestures: one finger, a single stroke, and movement along one or two cardinal directions. Of the two groups, blind people are less consistent when producing multi-stroke gestures. They are also more likely than low-vision people to stray outside the bounds of the display, which lacks a physical delimitation of its edges, especially with multi-touch gestures. For more complex gestures, rounded shapes are strongly preferred over angular ones, especially by blind people, who have difficulty performing straight-line gestures with sharp or right angles. Based on these results and on previous related research, we offer suggestions for improving the gesture accessibility of handheld touchscreen devices.
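As a rough illustration of the per-stroke analysis the abstract refers to, the sketch below computes a few geometric and kinematic descriptors (bounding box, path length, duration, mean speed) from timestamped touch samples. This is a minimal sketch under our own assumptions: the representation of a stroke as (x, y, t) tuples and the function name gesture_measures are hypothetical conveniences, not the paper's actual implementation.

```python
# Hypothetical sketch: simple geometric and kinematic descriptors of a
# single-stroke touch gesture, given samples as (x, y, t) tuples with
# coordinates in pixels and timestamps in seconds.
import math

def gesture_measures(samples):
    """Compute basic descriptors of one stroke."""
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    # Geometric terms: bounding box and total path length.
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    path_length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:])
    )
    # Kinematic terms: duration and mean speed along the path.
    duration = samples[-1][2] - samples[0][2]
    mean_speed = path_length / duration if duration > 0 else 0.0
    return {
        "width": width,
        "height": height,
        "path_length": path_length,
        "duration": duration,
        "mean_speed": mean_speed,
    }

if __name__ == "__main__":
    # A short diagonal swipe sampled at roughly 60 Hz.
    stroke = [(10 + 5 * i, 20 + 3 * i, i / 60.0) for i in range(30)]
    print(gesture_measures(stroke))
```

In a setup like the one described, such descriptors would be computed per stroke from the touch events streamed by the wireless capture system, with relative measures then comparing each execution against a reference articulation of the same gesture.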
