Human Factors: The Journal of the Human Factors and Ergonomics Society

Objective: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, in comparison with other auditory cues, with the aim of improving auditory menu navigation. Background: With the shrinking displays of mobile devices and the increasing use of technology by visually impaired users, it becomes important to improve the usability of interfaces that do not rely on a graphical user interface (GUI), such as auditory menus. Nonspeech sounds called auditory icons (i.e., representative real-world sounds of objects or events) or earcons (i.e., brief musical melodic patterns) have been proposed to enhance menu navigation. To compensate for the weaknesses of these traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognizable as speech. Method: The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy across cue types. In Experiments 3 and 4, they evaluated the learning rates of the cues and of speech itself. In Experiment 5, they assessed spearcon enhancements of a two-dimensional auditory menu, compared with plain TTS (text-to-speech rendering of the written menu items). Results: Spearcons outperformed both traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to that of normal speech and led to better performance than speech-only cues in two-dimensional menu navigation. Conclusion: These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. Application: Spearcons broaden the taxonomy of nonspeech auditory cues, and users can benefit from their application in real devices.
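
The spearcon technique itself is straightforward to prototype: take a TTS rendering of a menu item and apply pitch-preserving time compression until the clip plays extremely fast. Below is a minimal sketch, assuming a pre-recorded TTS clip on disk (the file name menu_item.wav, the compression rate, and the make_spearcon helper are illustrative assumptions, not the authors' pipeline) and using librosa's phase-vocoder time stretch as a stand-in for whatever time-scale modification algorithm was actually used.

```python
# Illustrative spearcon-generation sketch (not the authors' implementation).
# Assumes an existing TTS recording of a menu item, e.g. "menu_item.wav".
import librosa
import soundfile as sf

def make_spearcon(tts_wav_path: str, out_path: str, rate: float = 2.5) -> None:
    """Time-compress a spoken menu item while preserving its pitch.

    rate > 1 speeds up playback; spearcons are compressed aggressively,
    often until the phrase is no longer recognizable as speech.
    """
    y, sr = librosa.load(tts_wav_path, sr=None)           # load at native sample rate
    y_fast = librosa.effects.time_stretch(y, rate=rate)   # phase-vocoder time stretch
    sf.write(out_path, y_fast, sr)

make_spearcon("menu_item.wav", "menu_item_spearcon.wav", rate=2.5)
```

A pitch-preserving method matters here: naive resampling would shorten the clip but also raise its pitch ("chipmunk" speech), whereas a phase vocoder keeps the original pitch while compressing duration, matching the compressed-but-speech-derived character the abstract describes.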
