Self-motion leads to mandatory cue fusion across sensory modalities.

When perceiving properties of the world, we effortlessly combine multiple sensory cues into optimal estimates. Estimates derived from the individual cues are generally retained once the multisensory estimate is produced, and are discarded only if the cues stem from the same sensory modality (i.e., mandatory fusion). Does multisensory integration differ in this respect when the object of perception is one's own body rather than an external variable? We quantified how humans combine visual and vestibular information for perceiving own-body rotations and specifically tested whether such idiothetic cues are subject to mandatory fusion. Participants made extensive size comparisons between successive whole-body rotations using only visual, only vestibular, and both senses together. Probabilistic descriptions of the participants' perceptual estimates were compared with a Bayes-optimal integration model. The close match between model predictions and experimental data indicated a statistically optimal mechanism of multisensory integration. Most importantly, size discrimination data for rotations composed of both stimuli were best accounted for by a model in which only the bimodal estimator is accessible for perceptual judgments, as opposed to an independent or additive use of all three estimators (visual, vestibular, and bimodal). Indeed, participants' thresholds for detecting two multisensory rotations as different from one another were, in pertinent cases, larger than those measured using either single-cue estimate alone. Rotations that differed in terms of the individual visual and vestibular inputs but were quasi-identical in terms of the integrated bimodal estimate became perceptual metamers. This reveals an exceptional case of mandatory fusion of cues stemming from two different sensory modalities.
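For reference, the statistically optimal integration model invoked above is, in its standard formulation, a maximum-likelihood (reliability-weighted) combination of the unimodal estimates. The abstract does not spell out the model equations, so the following is a generic sketch under that assumption, with \hat{S}_{vis} and \hat{S}_{vest} denoting the visual and vestibular rotation estimates and \sigma_{vis}, \sigma_{vest} their respective noise standard deviations:

\[
\hat{S}_{bim} = w\,\hat{S}_{vis} + (1 - w)\,\hat{S}_{vest},
\qquad
w = \frac{\sigma_{vest}^{2}}{\sigma_{vis}^{2} + \sigma_{vest}^{2}},
\]
\[
\sigma_{bim}^{2} = \frac{\sigma_{vis}^{2}\,\sigma_{vest}^{2}}{\sigma_{vis}^{2} + \sigma_{vest}^{2}} \;\le\; \min\!\left(\sigma_{vis}^{2},\, \sigma_{vest}^{2}\right).
\]

Under this rule the fused estimate is always at least as reliable as either cue alone; mandatory fusion therefore shows up in discrimination rather than reliability. If only \hat{S}_{bim} is accessible, two rotations whose visual and vestibular components differ in opposite directions but yield the same weighted sum produce the same percept, and so become metamers, even though comparing the unimodal estimates separately would tell them apart.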
