Review of Self-Motion in the Context of Hearing and Hearing Device Research.

The benefit of directional hearing devices predicted in the laboratory often differs from the benefit reported by users, suggesting that laboratory findings lack ecological validity. This discrepancy may be partly caused by differences in self-motion between the laboratory and real-life environments. This literature review provides an overview of the methods used to measure and quantify self-motion, the test environments, and the measurement paradigms. Self-motion comprises the rotation and translation of the head and torso as well as the movement of the eyes. Studies were included if they explicitly assessed or controlled self-motion within the scope of hearing and hearing device research. The methods and outcomes of the reviewed studies are compared and discussed in relation to ecological validity. The reviewed studies demonstrate interactions between hearing device benefit and self-motion, such as a reduced benefit of directional microphones caused by more natural head movements when the test environment and task have realistic complexity. Factors associated with these interactions include the presence of audiovisual cues in the environment, interaction with conversation partners, and the nature of the tasks being performed. The review indicates that, although some aspects of the interaction between self-motion and hearing device benefit have been demonstrated and many methods for the assessment and analysis of self-motion are available, it remains unclear to what extent individual factors affect the ecological validity of the findings. Further research is required to relate laboratory-based measures of self-motion to an individual's real-life hearing ability.
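To illustrate how head rotation might be quantified in such studies, the minimal sketch below extracts yaw angles from quaternion head-tracker data and summarizes them as an RMS yaw deviation and a mean absolute yaw velocity. The data format (w, x, y, z quaternions), sampling rate, and choice of summary metrics are assumptions made for illustration only and are not taken from any of the reviewed studies.

```python
# Minimal sketch (illustrative assumptions, not from the reviewed studies):
# head orientation is given as unit quaternions (w, x, y, z) sampled at a
# fixed rate, and yaw is the rotation about the vertical axis.

import numpy as np

def quaternion_to_yaw(q):
    """Return the yaw angle (rotation about the vertical axis) in radians
    for a unit quaternion given as (w, x, y, z)."""
    w, x, y, z = q
    return np.arctan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def head_yaw_metrics(quaternions, fs):
    """Compute simple summary metrics of head yaw movement.

    quaternions : array of shape (n_samples, 4), unit quaternions (w, x, y, z)
    fs          : sampling rate in Hz
    Returns (RMS yaw deviation in degrees, mean absolute yaw velocity in deg/s).
    """
    yaw = np.unwrap([quaternion_to_yaw(q) for q in quaternions])
    yaw_deg = np.degrees(yaw)
    rms_yaw = np.sqrt(np.mean((yaw_deg - np.mean(yaw_deg)) ** 2))
    yaw_velocity = np.abs(np.diff(yaw_deg)) * fs
    return rms_yaw, np.mean(yaw_velocity)

if __name__ == "__main__":
    # Synthetic example: a 20-degree peak-to-peak head oscillation at 0.25 Hz,
    # sampled at 100 Hz for 10 seconds.
    fs = 100
    t = np.arange(0, 10, 1 / fs)
    yaw = np.radians(10.0 * np.sin(2 * np.pi * 0.25 * t))
    quats = np.column_stack([np.cos(yaw / 2), np.zeros_like(yaw),
                             np.zeros_like(yaw), np.sin(yaw / 2)])
    rms_yaw, mean_vel = head_yaw_metrics(quats, fs)
    print(f"RMS yaw deviation: {rms_yaw:.1f} deg, "
          f"mean |yaw velocity|: {mean_vel:.1f} deg/s")
```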
