Design of Web-Based Tools to Study Blind People's Touch-Based Interaction with Smartphones

Nowadays, touchscreen smartphones are the most common kind of mobile device. However, gesture-based interaction is a difficult task for most visually impaired people, and even more so for blind people. This difficulty is compounded by the lack of standard gestures and by the differences between the main screen reader platforms available on the market. Our goal is therefore to investigate differences and preferences in touch gesture performance on smartphones among visually impaired people. For our study, we implemented a web-based wireless system to facilitate the capture of participants' gestures. In this paper we present an overview of both the study and the system used.
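The paper does not detail the capture system's implementation, but a web-based gesture logger of the kind described would typically accumulate sampled touch points on the participant's device and forward each completed trace to the experimenter's machine. The sketch below is a minimal illustration under that assumption; the `GestureRecorder` class and all names in it are hypothetical, not taken from the authors' system. In a browser, `addPoint` would be wired to `touchstart`/`touchmove` handlers and `endGesture` to `touchend`, with the serialized trace sent over, e.g., a WebSocket.

```typescript
// Hypothetical sketch of web-based touch-gesture capture (not the authors' code).
// Each sampled touch point carries screen coordinates and a timestamp.
interface TouchPoint { x: number; y: number; t: number; }
interface GestureTrace { id: number; points: TouchPoint[]; }

class GestureRecorder {
  private traces: GestureTrace[] = [];
  private current: TouchPoint[] = [];

  // Would be called from touchstart/touchmove handlers: record one sample.
  addPoint(x: number, y: number, t: number): void {
    this.current.push({ x, y, t });
  }

  // Would be called from a touchend handler: close the trace and return
  // its JSON form, ready to be transmitted wirelessly for logging.
  endGesture(): string {
    const trace: GestureTrace = { id: this.traces.length, points: this.current };
    this.traces.push(trace);
    this.current = [];
    return JSON.stringify(trace);
  }
}
```

A design choice worth noting: serializing on the client and shipping whole traces keeps the participant's smartphone stateless between gestures, so the experimenter's machine holds the authoritative record of the session.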
