The Virtual Reality Lab: Realization and Application of Virtual Sound Environments

To assess perception and performance with modern and future hearing devices that employ advanced adaptive signal processing, novel evaluation methods are required that go beyond established procedures. Such methods should reproduce, to a controlled extent, the complexity and variability of the acoustic conditions and acoustic communication styles encountered in real life. This article discusses the current state and the perspectives of virtual reality technology in the laboratory for designing complex audiovisual communication environments for hearing assessment and for hearing device design and evaluation. By increasing the ecological validity of laboratory experiments, that is, the degree to which laboratory data reflect real-life hearing-related function, and by supporting the development of improved hearing-related procedures and interventions, such a virtual reality lab marks a transition from conventional audio-only laboratory experiments toward the field. The first part of the article introduces and discusses the notion of the communication loop as a theoretical basis for understanding the factors relevant to acoustic communication in real life. From this, requirements are derived for assessing the extent to which a virtual reality lab reflects these factors; these requirements may serve as a proxy for ecological validity. The most important factor of real-life communication identified is a closed communication loop among actively behaving participants. The second part gives an overview of the current developments toward a virtual reality lab at Oldenburg University that aims at interactive and reproducible testing of subjects with and without hearing devices in challenging communication conditions. The extent to which the virtual reality lab in its current state meets the requirements defined in the first part is discussed, along with its limitations and potential further developments.
Finally, data are presented from a qualitative study that compared subject behavior and performance in two audiovisual environments presented in the virtual reality lab, a street and a cafeteria, with the corresponding field environments. The results show similarities and differences in subject behavior and performance between lab and field, indicating that the virtual reality lab in its current state is a step toward greater ecological validity in lab-based hearing and hearing device research, but that further development is required to reach higher levels of ecological validity.