Vibro-Tactile Enrichment Improves Blind User Interaction with Mobile Touchscreens

Interaction via mobile devices is a challenge for blind users, who often encounter severe accessibility and usability problems. The main issues stem from the lack of hardware keys, which makes it difficult to quickly reach an area or activate functions, and from the new way of interacting via the touchscreen. A touchscreen has no specific reference points detectable by touch, so a blind user cannot easily understand exactly where (s)he is positioned on the interface, nor readily find a specific item or function. Alternative ways to provide content are mainly vocal and may be inadequate in some situations, e.g., noisy environments. In this study we investigate enriching the user interfaces of touchscreen mobile devices to facilitate blind users' orientation. We propose a possible solution for improving interaction based on the vibro-tactile channel. After introducing the idea behind our approach, we present and discuss two implemented Android apps that include the enriched user interfaces.
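As a rough illustration of how such vibro-tactile enrichment could be wired up on Android, the sketch below triggers distinct vibration patterns as the finger moves over different regions of the interface. This is not the authors' implementation: the layout and widget ids (R.layout.main, R.id.root_layout, R.id.ok_button), the vibration patterns, and the region tests are placeholder assumptions, and the android.permission.VIBRATE permission is assumed to be declared in the manifest. It uses only the long-standing Vibrator.vibrate(long[], int) API.

import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.os.Vibrator;
import android.view.MotionEvent;
import android.view.View;

public class VibroFeedbackActivity extends Activity {

    // Hypothetical vibration patterns in ms: {initial delay, on, off, on, ...}
    private static final long[] PATTERN_EDGE   = {0, 40};          // short pulse near a screen edge
    private static final long[] PATTERN_BUTTON = {0, 30, 60, 30};  // double pulse over an actionable item

    private Vibrator vibrator;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main); // assumed layout resource

        vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);

        View root = findViewById(R.id.root_layout); // assumed root container id
        root.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                if (event.getAction() == MotionEvent.ACTION_MOVE) {
                    // Debouncing omitted for brevity; a real app would avoid
                    // re-triggering the pattern on every MOVE event.
                    if (isNearEdge(v, event)) {
                        vibrator.vibrate(PATTERN_EDGE, -1);   // -1 = do not repeat
                    } else if (isOverButton(event)) {
                        vibrator.vibrate(PATTERN_BUTTON, -1);
                    }
                }
                return false; // let other consumers (e.g. the screen reader) also handle the event
            }
        });
    }

    // Illustrative region test: within 48 px of the left or right border.
    private boolean isNearEdge(View v, MotionEvent event) {
        return event.getX() < 48 || event.getX() > v.getWidth() - 48;
    }

    // Placeholder hit test against the vertical bounds of an assumed button.
    private boolean isOverButton(MotionEvent event) {
        View button = findViewById(R.id.ok_button); // assumed widget id
        return event.getY() > button.getTop() && event.getY() < button.getBottom();
    }
}

In this sketch each interface region is mapped to its own tactile "signature", so the user can distinguish, by feel alone, an edge of the screen from an actionable widget; richer designs could vary pulse count, duration, or rhythm per widget type.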
