Haptic access to conventional 2D maps for the visually impaired

This paper describes a framework for analysing conventional map images and presenting the extracted semantic information to blind users through alternative modalities (haptics and audio). The resulting haptic-audio representation of the map supports navigation and path-planning tasks. The proposed framework uses novel segmentation algorithms based on morphological filters, which provide indexed information on both the street network structure and the positions of street names within the map. Off-the-shelf OCR and text-to-speech (TTS) components then convert the visual street-name information into audio messages. Finally, a grooved-line representation of the street network is generated, which blind users can explore with a haptic device. During exploration, audio messages report the user's current position (e.g. street name, approaching crossroad, and so on). Experimental results show that blind users consider the proposed system very promising, and that it produces maps for the blind considerably faster than traditional methods such as Braille images.
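The paper's segmentation stage relies on morphological filtering to index the street network and the street-name positions separately. As a rough illustration only, and not the authors' actual algorithm, the sketch below separates a binarized map into line-like (street) and compact (text) components with generic OpenCV morphology and then runs OCR on the text candidates; the libraries (OpenCV, pytesseract), structuring-element sizes, and thresholds are all assumptions made for this example.

```python
# Illustrative sketch: separate streets from street-name text in a scanned map,
# then OCR the text regions. Not the paper's connected-operator pipeline.
import cv2
import pytesseract


def extract_streets_and_labels(map_path):
    """Return a street-network mask and a list of ((x, y, w, h), name) pairs."""
    img = cv2.imread(map_path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Long thin structures (streets) survive an opening with a large linear
    # structuring element; small compact components (characters) do not.
    h_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (25, 1))
    v_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (1, 25))
    streets = (cv2.morphologyEx(binary, cv2.MORPH_OPEN, h_kernel)
               | cv2.morphologyEx(binary, cv2.MORPH_OPEN, v_kernel))

    # What remains after removing the street network is mostly label text.
    text_mask = cv2.subtract(binary, streets)
    n, _, stats, _ = cv2.connectedComponentsWithStats(text_mask)

    names = []
    for i in range(1, n):  # label 0 is the background
        x, y, w, h, area = stats[i]
        if area < 20:      # discard specks / noise
            continue
        roi = img[y:y + h, x:x + w]
        word = pytesseract.image_to_string(roi, config="--psm 7").strip()
        if word:
            names.append(((x, y, w, h), word))

    return streets, names
```

The positioned street-name strings returned here would then feed a TTS engine, while the street mask would drive the grooved-line haptic rendering described in the abstract.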
