Task-uninformative visual stimuli improve auditory spatial discrimination in humans but not the ideal observer

To survive and function in the world, we must understand the content of our environment, which requires us to gather and parse complex, sometimes conflicting, information. Yet the brain translates sensory stimuli from disparate modalities into a cohesive and accurate percept with little conscious effort. Previous studies of multisensory integration have suggested that the brain's combination of cues is well approximated by an ideal observer implementing Bayesian causal inference. However, behavioral data from tasks that include only one stimulus in each modality fail to capture the complexity of natural scenes, in which multiple stimuli compete within each modality. Here we employed an auditory spatial discrimination task in which listeners reported on which side they heard one of two concurrently presented sounds. We compared two visual conditions in which task-uninformative shapes were presented either at the center of the screen or spatially aligned with the auditory stimuli. We found that performance on the auditory task improved when the visual stimuli were spatially aligned with the auditory stimuli, even though the shapes provided no information about which side the auditory target was on. We further show that a Bayesian ideal observer performing causal inference cannot account for this improvement, indicating that human listeners deviate systematically from the ideal observer model.
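To make the comparison model concrete, below is a minimal sketch of the standard Bayesian causal inference observer (after Kording et al., 2007) applied to audiovisual spatial localization. The Gaussian likelihoods, Gaussian spatial prior, model-averaging decision rule, and all numerical parameter values are generic assumptions for this model class, not the fitted model or parameters from this study.

```python
import numpy as np

def gauss(x, mu, var):
    """Gaussian pdf N(x; mu, var)."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def causal_inference_estimate(x_a, x_v, sig_a=5.0, sig_v=1.0,
                              mu_p=0.0, sig_p=20.0, p_common=0.5):
    """
    Model-averaged auditory location estimate under Bayesian causal
    inference. Parameter values are illustrative placeholders only.
    """
    va, vv, vp = sig_a ** 2, sig_v ** 2, sig_p ** 2

    # Likelihood of the two measurements under a common cause (C = 1),
    # with the shared source location s integrated out analytically.
    denom1 = va * vv + va * vp + vv * vp
    like_c1 = np.exp(-0.5 * ((x_a - x_v) ** 2 * vp
                             + (x_a - mu_p) ** 2 * vv
                             + (x_v - mu_p) ** 2 * va) / denom1) \
        / (2 * np.pi * np.sqrt(denom1))

    # Likelihood under independent causes (C = 2): each measurement is
    # explained by its own source drawn from the spatial prior.
    like_c2 = gauss(x_a, mu_p, va + vp) * gauss(x_v, mu_p, vv + vp)

    # Posterior probability of a common cause (Bayes' rule).
    post_c1 = (like_c1 * p_common) / (like_c1 * p_common
                                      + like_c2 * (1 - p_common))

    # Reliability-weighted location estimates under each causal structure.
    s_hat_c1 = (x_a / va + x_v / vv + mu_p / vp) / (1 / va + 1 / vv + 1 / vp)
    s_hat_c2 = (x_a / va + mu_p / vp) / (1 / va + 1 / vp)

    # Model averaging: weight each estimate by its posterior probability.
    return post_c1 * s_hat_c1 + (1 - post_c1) * s_hat_c2

# Example: a reliable visual measurement near the auditory one pulls the
# auditory estimate toward it (the ventriloquist effect).
print(causal_inference_estimate(x_a=8.0, x_v=5.0))
```

Because such an observer only shifts its auditory estimate toward a plausibly co-located visual cue, it predicts no benefit from visual stimuli that carry no information about the target side, which is why it cannot reproduce the improvement reported here.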
