Design of a multimodal hearing system

Hearing instruments (HIs) have become context-aware devices that analyze the acoustic environment in order to automatically adapt sound processing to the user’s current hearing wish. However, in the same acoustic environment, an HI user can have different hearing wishes that require different behaviors from the hearing instrument. In these cases, the audio signal alone contains too little contextual information to determine the user’s hearing wish. Modalities complementary to sound can provide the missing information and thus improve the adaptation. In this work, we review such additional modalities for HIs and present a prototype of a newly developed wireless multimodal hearing system. The platform incorporates additional sensor modalities such as the user’s body movement and location. We characterize the system with regard to runtime, latency, and reliability of the wireless connection, and point out possibilities arising from this novel approach.
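The ambiguity described above can be made concrete with a minimal sketch: the same acoustic scene is mapped to different HI programs depending on a second modality (here, a body-movement cue). All function names, class labels, and decision rules below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of multimodal program selection: an audio-based sound
# classification is disambiguated by a motion cue from a body-worn sensor.
# Labels and rules are assumptions for illustration only.

def select_program(sound_class: str, user_moving: bool) -> str:
    """Choose an HI program from the acoustic context plus a motion cue.

    In a 'speech in noise' scene, the same audio can call for different
    behavior: a walking user may want environmental awareness
    (omnidirectional), while a stationary user in conversation may prefer
    a directional beamformer.
    """
    if sound_class == "speech_in_noise":
        return "omnidirectional" if user_moving else "directional_beamformer"
    if sound_class == "music":
        return "music_program"
    return "default"

print(select_program("speech_in_noise", user_moving=True))   # omnidirectional
print(select_program("speech_in_noise", user_moving=False))  # directional_beamformer
```

Identical audio input yields two different programs once the movement modality is added, which is precisely the contextual gap the multimodal platform is meant to close.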
