Chinese-Based Spearcons: Improving Pedestrian Navigation Performance in Eyes-Free Environment

This article presents nonspeech audio cues, specifically English-based spearcons and Chinese-based spearcons, for representing distance and forward-direction in pedestrian navigation in an eyes-free environment. A field experiment with 10 participants (all native Chinese speakers), using a within-subject design, evaluated English-based spearcons, Chinese-based spearcons, and Chinese text-to-speech (TTS). The results suggest that Chinese-based spearcons support faster task completion than Chinese TTS, and that they convey distance and forward-direction more effectively than English-based spearcons in pedestrian navigation. Overall, participants reported satisfaction with Chinese-based spearcons as auditory feedback for pedestrian navigation.
