Effects of auditory reliability and ambiguous visual stimuli on auditory spatial discrimination

The brain combines information from multiple sensory modalities to interpret the environment. Multisensory integration is often modeled as ideal Bayesian causal inference, in which perceptual decisions arise from a statistical weighting of the information from each sensory modality according to its reliability and its relevance to the observer's task. However, ideal Bayesian causal inference fails to describe human behavior in an auditory spatial discrimination task in which spatially aligned visual stimuli improve performance despite providing no information about the correct response. This work tests the hypothesis that humans weight auditory and visual information in this task according to their relative reliabilities, even though the visual stimuli are task-uninformative and should therefore receive zero weight. Listeners perform an auditory spatial discrimination task in which the relative reliabilities of the auditory and visual stimuli are modulated by their durations. Comparing conditions in which the task-uninformative visual stimuli are either spatially aligned with the auditory stimuli or centrally located (control condition) shows that listeners exhibit a larger multisensory effect when their auditory thresholds are worse. Thus, even when visual stimuli are not task-informative, the brain combines sensory information that is scene-relevant, especially when unreliable auditory information makes the task difficult.
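The reliability-based weighting that the abstract refers to is usually formalized as inverse-variance (maximum-likelihood) cue combination, in which each cue's weight is proportional to its reliability, the inverse of its noise variance. The sketch below is illustrative only, assuming Gaussian cue noise; the function name and example values are hypothetical, not taken from the paper.

```python
import math

def reliability_weighted_estimate(s_a, sigma_a, s_v, sigma_v):
    """Fuse auditory and visual location estimates by inverse-variance
    weighting: each cue's weight is its reliability (1/variance) divided
    by the summed reliabilities. Returns the fused estimate and its s.d."""
    r_a = 1.0 / sigma_a**2          # auditory reliability
    r_v = 1.0 / sigma_v**2          # visual reliability
    w_a = r_a / (r_a + r_v)         # auditory weight (visual weight is 1 - w_a)
    s_hat = w_a * s_a + (1.0 - w_a) * s_v
    sigma_hat = math.sqrt(1.0 / (r_a + r_v))  # fused s.d. is below either cue's
    return s_hat, sigma_hat

# Illustrative values: a less reliable auditory cue (sigma_a = 4) is pulled
# toward a more reliable visual cue (sigma_v = 2) at a different location.
s_hat, sigma_hat = reliability_weighted_estimate(s_a=10.0, sigma_a=4.0,
                                                 s_v=0.0, sigma_v=2.0)
```

Under this model a task-uninformative cue should carry zero weight with respect to the decision, which is why the observed visual benefit is the puzzle the paper addresses.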
