Mapping information to audio and tactile icons

We report the results of a study focusing on the meanings that can be conveyed by audio and tactile icons. Our research considers the following question: how can audio and tactile icons be designed to optimise congruence between crossmodal feedback and the type of information this feedback is intended to convey? For example, given a set of system warnings, confirmations, progress updates and errors, what audio and tactile representations best match the information or type of message? Is one modality more appropriate than the other for presenting certain types of information? The results of this study indicate that certain parameters of the audio and tactile modalities, such as rhythm, texture and tempo, play an important role in the creation of congruent sets of feedback for a specific type of information to be transmitted. We argue that a combination of audio or tactile parameters derived from our results allows the same type of information to be conveyed through touch and sound, with an intuitive match to the content of the message.
