Interactive 3D sonification for the exploration of city maps

Blind or visually impaired people usually do not visit unknown cities or places without assistance. One reason is that it is hardly possible for them to gain a non-visual overview of a new place, its landmarks, and its geographic entities in advance at home, whereas sighted people can simply consult a printed or digital map. Existing haptic and acoustic approaches do not offer an economical way to convey the layout of a map and the relations between objects such as distance, direction, and object size. We present an interactive three-dimensional sonification interface for exploring city maps. A blind person can build a mental model of an area's structure by virtually exploring an auditory map at home. Geographic objects and landmarks are represented by sound areas placed within a sound room; each type of object is associated with a distinct sound and can therefore be identified. By investigating the auditory map, the user gains an impression of the various objects, their directions, and their relative distances. First user tests show that users are able to reproduce a sonified city map that comes close to the original visual map. Exploring a map through non-speech sound areas thus provides a new user interface metaphor whose potential extends beyond blind and visually impaired persons to applications for sighted users.
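The core idea above can be illustrated with a minimal sketch. The paper does not specify its audio engine or falloff model, so everything here is an assumption: a hypothetical `SoundArea` record, an invented type-to-sound table, a simple inverse-distance gain, and a left/right pan derived from the horizontal direction to the listener.

```python
import math
from dataclasses import dataclass

@dataclass
class SoundArea:
    """A geographic object rendered as a looping sound region (hypothetical model)."""
    name: str
    kind: str        # e.g. "park", "water", "building"
    x: float
    y: float
    radius: float    # extent of the sound area on the map

# Hypothetical mapping of object types to characteristic sounds;
# the paper only states that each object type gets a distinct sound.
SOUND_FOR_KIND = {"park": "birds.wav", "water": "stream.wav", "building": "hum.wav"}

def render_cue(area: SoundArea, listener_x: float, listener_y: float):
    """Return (sound, gain, pan) for one sound area relative to the listener.

    Gain is full inside the area and falls off with distance beyond its edge,
    so relative distances become audible; pan maps the horizontal direction
    to [-1, 1] (left .. right), so direction becomes audible.
    """
    dx, dy = area.x - listener_x, area.y - listener_y
    dist = math.hypot(dx, dy)
    edge = max(dist - area.radius, 0.0)   # distance from listener to the area's border
    gain = 1.0 / (1.0 + edge)             # assumed simple inverse falloff
    pan = 0.0 if dist == 0 else max(-1.0, min(1.0, dx / dist))
    return SOUND_FOR_KIND[area.kind], gain, pan
```

For example, a park centered 10 units to the listener's right with radius 5 would be heard fully panned right at reduced gain, while standing inside the park would yield full gain. A real implementation would feed these parameters to a spatial audio API (e.g. OpenAL or HRTF-based rendering) rather than computing a flat stereo pan.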
