Encoding of Information in Auditory Displays: Initial Research on Flexibility and Processing Codes in Dual-task Scenarios

Interest in the use of sound as a means of information display in human-machine systems has surged in recent years. While researchers have begun to address issues surrounding good auditory display design as well as potential domains of application, little is known about the cognitive processes involved in interpreting auditory displays. In multi-tasking scenarios, dividing concurrent information display across modalities (e.g., vision and audition) may allow the human operator to receive (i.e., to sense and perceive) more information, yet higher-level conflicts in the encoding and representation of information may persist. Surprisingly few studies to date have examined auditory information display in dual-task scenarios. This study examined the flexibility of information encoding and processing-code conflicts in a dual-task paradigm with auditory graphs—a specific class of auditory displays that represent quantitative information with sound. Results showed that (1) patterns of dual-task interference were task dependent, and (2) a verbal interference task was more disruptive to auditory graph performance than a visuospatial interference task, particularly for point estimation.
