Towards a model human cochlea: sensory substitution for crossmodal audio-tactile displays

We present the Model Human Cochlea (MHC): a sensory substitution technique for creating a crossmodal audio-tactile display. This research aims to design a chair-based interface that supports deaf and hard-of-hearing users in experiencing the musical content associated with film, and to develop this multisensory crossmodal display into a framework for research on enhancing sensory entertainment experiences through universal design. The MHC uses audio speakers as vibrotactile devices placed along the body to express the emotional elements associated with music. We present the results of a formative study that compared the MHC with conventional audio speaker displays for communicating basic emotional information through touch. The results suggest that separating the audio signal onto multiple vibrotactile channels expresses emotional content more effectively than presenting the complete audio signal as a single vibrotactile stimulus.
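The core signal-processing idea — splitting one audio signal into frequency bands so that each band drives its own vibrotactile channel along the body — can be sketched in a few lines. The band edges, channel count, and naive DFT filter below are illustrative assumptions for a minimal self-contained example, not the paper's actual implementation.

```python
# Hedged sketch: route one audio signal to multiple vibrotactile channels
# by frequency band. Band edges and the O(n^2) DFT are illustrative only.
import math

def split_into_bands(signal, sample_rate, band_edges):
    """Return one time-domain signal per (lo, hi) frequency band, so each
    band can drive a separate vibrotactile actuator."""
    n = len(signal)
    # Forward DFT (quadratic time; fine for a short illustrative signal).
    spectrum = [sum(signal[t] * complex(math.cos(-2 * math.pi * k * t / n),
                                        math.sin(-2 * math.pi * k * t / n))
                    for t in range(n)) for k in range(n)]
    channels = []
    for lo, hi in band_edges:
        # Keep only bins whose frequency (or mirror image, for the
        # negative-frequency half of a real signal) falls in [lo, hi).
        filtered = []
        for k in range(n):
            freq = k * sample_rate / n
            mirror = (n - k) * sample_rate / n
            keep = (lo <= freq < hi) or (lo <= mirror < hi)
            filtered.append(spectrum[k] if keep else 0j)
        # Inverse DFT; the result is real up to rounding error.
        channel = [sum(filtered[k] * complex(math.cos(2 * math.pi * k * t / n),
                                             math.sin(2 * math.pi * k * t / n))
                       for k in range(n)).real / n for t in range(n)]
        channels.append(channel)
    return channels
```

For example, a mixture of a 50 Hz and a 400 Hz tone split over the bands (0, 200) and (200, 500) Hz separates cleanly: the first channel carries only the 50 Hz component and the second only the 400 Hz component, each available as an independent vibrotactile stimulus.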
