Accessible smartphones for blind users: A case study for a wayfinding system

While progress has been made on assistive technologies, some blind users still face several problems accessing and using basic functions when interacting with touch interfaces. People with visual impairments may also have difficulty navigating autonomously, without personal assistance, especially in unfamiliar environments. This paper presents a complete solution for managing the basic functions of a smartphone and for guiding users with a wayfinding application. In this way, a blind user could travel from home to work autonomously using an adaptable wayfinding application on a smartphone. The wayfinding application combines text, map, auditory, and tactile feedback to convey information. Eighteen visually impaired users tested the application. Preliminary results from this study show that blind and low-vision users can use the wayfinding application effectively without help. The evaluation also confirms the usefulness of extending the vibration feedback to convey distance information as well as directional information. The validation was successful on both iOS and Android devices.