A client-server architecture for audio-supported mobile route guiding for hiking

This paper introduces an architecture for speech- and auditory cue-based route instructions. The server side consists of a set of web services, and the architecture allows different route services to be connected behind a single interface. Route instructions can be transferred between the server and the client as recorded speech, vibration patterns, auditory cues, or text. The OpenLS Route Service schema is extended so that route instruction responses include references to recorded speech, auditory cues, vibration patterns, encoded textual instructions, and brief instructions. The auditory cues may be auditory icons, earcons, or spearcons. The textual instructions are based on the Speech Synthesis Markup Language (SSML), so a text-to-speech engine on the client side can render them automatically. The presented holistic approach aims to increase the accessibility of route services, especially for visually impaired and elderly people.
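To make the schema extension concrete, the following is a minimal sketch of what an extended OpenLS-style route instruction response might look like. The element and attribute names (`AudioRef`, `VibrationPattern`, `BriefInstruction`) and the URLs are illustrative assumptions, not the actual extension defined in the paper:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch: one route instruction carrying an SSML text body plus
# references to recorded speech, an auditory cue, a vibration pattern, and a
# brief instruction. Names are illustrative, not the paper's real schema.
def build_instruction(ssml_text, speech_url, cue_url, vibration, brief):
    instr = ET.Element("RouteInstruction")
    text = ET.SubElement(instr, "Instruction",
                         {"format": "application/ssml+xml"})
    text.text = ssml_text  # rendered client-side by a text-to-speech engine
    ET.SubElement(instr, "AudioRef",
                  {"type": "speech", "href": speech_url})
    ET.SubElement(instr, "AudioRef",
                  {"type": "auditory-cue", "href": cue_url})
    ET.SubElement(instr, "VibrationPattern").text = vibration
    ET.SubElement(instr, "BriefInstruction").text = brief
    return instr

instr = build_instruction(
    "<speak>Turn left onto the forest trail.</speak>",
    "https://example.org/audio/instr_12.ogg",
    "https://example.org/cues/left_turn_spearcon.ogg",
    "200,100,200",          # assumed on/off durations in milliseconds
    "Left onto trail",
)
xml_str = ET.tostring(instr, encoding="unicode")
print(xml_str)
```

The point of bundling references rather than raw audio is that a client can fetch only the modalities it supports: a screen reader user might request the recorded speech and vibration pattern, while a sighted hiker uses the brief textual form.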
