NAVIG: augmented reality guidance system for the visually impaired

Navigating complex routes and finding objects of interest are challenging tasks for the visually impaired. The NAVIG project (Navigation Assisted by artificial VIsion and GNSS) aims to increase personal autonomy through a virtual augmented reality system. The system integrates an adapted geographic information system whose different classes of objects improve route selection and guidance. The database also includes models of important geolocated objects that can be detected by real-time embedded vision algorithms. Object localization relative to the user can serve both global positioning and sensorimotor actions such as heading, grasping, or piloting. Users are guided to their desired destination through spatialized semantic audio rendering, continuously maintained in a head-centered reference frame. This paper presents the overall design and architecture of the NAVIG system. In addition, details of a new type of detection and localization device are presented. This approach combines a bio-inspired vision system that can recognize and locate objects very quickly with a 3D sound rendering system that can perceptually position a sound at the location of the recognized object. The system was developed in accordance with guidance directives established through participatory design with potential users and educators for the visually impaired.
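To illustrate the head-centered audio rendering described above, the following minimal sketch (not from the NAVIG implementation; all function names and details are hypothetical assumptions) computes the bearing from a GNSS position to a geolocated object and converts it into a head-relative azimuth, the kind of angle a binaural renderer would use to place a sound at the object's perceived location:

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
    in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def head_centered_azimuth(obj_bearing_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of an object in the head-centered reference frame.

    obj_bearing_deg: compass bearing from the user to the object (0 = north).
    head_yaw_deg: the user's head orientation (0 = facing north).
    Returns an angle in (-180, 180], positive to the user's right.
    """
    az = (obj_bearing_deg - head_yaw_deg) % 360.0
    if az > 180.0:
        az -= 360.0
    return az
```

For example, an object due north of a user whose head is turned 10 degrees to the right would be rendered 10 degrees to the left; recomputing this angle on every head-tracker update is what keeps the sound anchored to the object's world position.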