Recognition of Hearing Needs from Body and Eye Movements to Improve Hearing Instruments

Hearing instruments (HIs) have emerged as true pervasive computers: they continuously adapt the hearing program to the user's context. However, current HIs cannot distinguish different hearing needs within the same acoustic environment. In this work, we explore how information derived from body and eye movements can improve the recognition of such hearing needs. We conduct an experiment that creates an acoustic environment in which two different hearing needs arise: actively conversing, and working while colleagues hold a conversation in a noisy office. For eleven participants, we record body movements at nine body locations, eye movements using electrooculography (EOG), and sound using commercial HIs. Using a support vector machine (SVM) classifier with person-independent training, we improve the recognition accuracy from 77% (sound only) to 92% (body movements). With a view to a future implementation in a HI, we then analyse in detail the sensors attached to the head, achieving 86% accuracy using eye movements and 84% using head movements. Our work demonstrates the potential of additional sensor modalities for future HIs and motivates investigating the wider applicability of this approach to further hearing situations and needs.
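The abstract does not detail the evaluation pipeline, but a minimal sketch of the person-independent SVM evaluation it describes, assuming a leave-one-participant-out protocol with placeholder features, labels, and participant IDs (all hypothetical, for illustration only), could look as follows in scikit-learn:

```python
# Hypothetical sketch of person-independent SVM classification of two
# hearing needs, evaluated leave-one-participant-out. The feature data,
# labels, and participant assignment below are synthetic placeholders,
# not the study's actual pipeline.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# X: one feature vector per analysis window (e.g. statistics derived from
# body or eye movement signals); y: hearing-need label per window
# (0 = working while others converse, 1 = active conversation);
# groups: participant ID per window, so each fold holds out one person.
X = rng.normal(size=(1100, 20))          # placeholder features
y = rng.integers(0, 2, size=1100)        # placeholder labels
groups = np.repeat(np.arange(11), 100)   # 11 participants

clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=10000))
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(f"mean person-independent accuracy: {scores.mean():.2%}")
```

Grouping the folds by participant, rather than shuffling windows freely, is what makes the reported accuracies person-independent: the classifier is always tested on someone it has never seen during training.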
