Using Sounds to Present and Manage Information in Computers

The auditory modality, encompassing speech, signals, and natural sounds, is one of the most important ways to present and communicate information. In computer interfaces, however, its possibilities have been almost entirely neglected: audio is usually limited to simple signals (beeps and clicks) or background music. This paper outlines some possibilities for presenting and managing information in computers using audio, from the perspective of the semiotic theory of signs. Auditory interfaces can be especially useful for people with visual or kinaesthetic disabilities, as well as in contexts and with devices where visual-kinaesthetic operation of the machine is difficult, for example while on the move or with small-display devices.
