The Impact of Low Vision on Touch-Gesture Articulation on Mobile Devices

A comprehensive study of stroke gestures produced on mobile touchscreens by users with low vision reveals that some aspects of gesture articulation, such as gesture length and size, are minimally affected by the impairment, while other aspects, such as production times, reveal suboptimal visuomotor routines of eye-hand coordination. Informed by the study's findings, the authors outline design guidelines for effective stroke-gesture input on mobile devices under low-vision conditions.