A dynamical framework to relate perceptual variability with multisensory information processing

Multisensory processing recruits individual sensory streams, e.g., vision and audition, to facilitate perception of environmental stimuli. An experimental window into the underlying complexity is the "McGurk effect": incongruent auditory and visual vocalization stimuli elicit the perception of illusory speech sounds. Subsequent studies have established that the time delay between the onsets of the auditory and visual signals (AV lag) and perturbations in the unisensory streams are key variables that modulate perception. However, few quantitative theoretical frameworks have so far been proposed to capture the interplay among these psychophysical variables or the systems-level neural interactions that govern perceptual variability. Here, we propose a dynamical-systems model comprising the basic ingredients of any multisensory processing reported by several researchers: two unisensory sub-systems and one multisensory sub-system (nodes). The nodes are connected such that biophysically inspired coupling parameters and time delays become the key parameters of the network. We observe that zero AV lag yields maximum synchronization of the constituent nodes, and that the degree of synchronization decreases for non-zero lags. The attractor states of this network can thus be interpreted as facilitating the stabilization of specific perceptual experiences. The model thereby provides a quantitative framework for understanding multisensory information processing.
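The architecture described above, two unisensory nodes converging with delays onto a multisensory node, can be illustrated with a toy delay-coupled Kuramoto network. This is a minimal sketch under assumed parameters (identical 10 Hz natural frequencies, a hypothetical transmission delay `tau_ms`, and the AV lag modelled as an extra delay on the visual-to-multisensory connection); it is not the paper's actual model or parameterization.

```python
import numpy as np

def simulate_sync(av_lag_ms=0.0, coupling=5.0, tau_ms=10.0,
                  f_hz=10.0, t_max=2.0, dt=1e-3, seed=0):
    """Euler-integrate three delay-coupled Kuramoto phase oscillators:
    auditory (A), visual (V), and multisensory (M) nodes. The AV lag is
    modelled as an extra transmission delay on the V -> M connection.
    Returns the time-averaged Kuramoto order parameter r in [0, 1]."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    tau = int(tau_ms * 1e-3 / dt)            # A->M / V->M delay (steps)
    lag = int(abs(av_lag_ms) * 1e-3 / dt)    # extra AV lag on V->M
    hist = tau + lag + 1                     # history buffer for the delays
    omega = 2.0 * np.pi * f_hz * np.ones(3)  # identical natural frequencies
    theta = np.zeros((hist + n_steps, 3))
    theta[:hist] = rng.uniform(0.0, 2.0 * np.pi, 3)  # random initial phases
    for t in range(hist, hist + n_steps):
        a, v, m = theta[t - 1]
        a_d = theta[t - 1 - tau, 0]          # delayed auditory input to M
        v_d = theta[t - 1 - tau - lag, 1]    # delayed (+ AV lag) visual input
        dtheta = omega + coupling * np.array([
            np.sin(m - a),                        # M -> A feedback
            np.sin(m - v),                        # M -> V feedback
            np.sin(a_d - m) + np.sin(v_d - m)])   # A, V -> M convergence
        theta[t] = theta[t - 1] + dt * dtheta
    late = theta[hist + n_steps // 2:]       # discard the initial transient
    r = np.abs(np.exp(1j * late).mean(axis=1))
    return float(r.mean())
```

Sweeping `av_lag_ms` (e.g., comparing `simulate_sync(0.0)` against `simulate_sync(200.0)`) gives one way to probe how an onset asynchrony degrades network synchronization, the quantity the abstract links to perceptual stability.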

[1]  Ophelia Deroy,et al.  Multisensory constraints on awareness , 2014, Philosophical Transactions of the Royal Society B: Biological Sciences.

[2]  C. Spence,et al.  The Handbook of Multisensory Processing , 2004 .

[3]  Benjamin A. Rowland,et al.  Hebbian mechanisms help explain development of multisensory integration in the superior colliculus: a neural network model , 2012, Biological Cybernetics.

[4]  B. Ross,et al.  Evidence for training-induced crossmodal reorganization of cortical functions in trumpet players , 2003, Neuroreport.

[5]  J A Kelso,et al.  The nonlinear dynamics of speech categorization. , 1994, Journal of experimental psychology. Human perception and performance.

[6]  R. Spigler,et al.  The Kuramoto model: A simple paradigm for synchronization phenomena , 2005 .

[7]  Pascal Belin,et al.  Crossmodal Adaptation in Right Posterior Superior Temporal Sulcus during Face–Voice Emotional Integration , 2014, The Journal of Neuroscience.

[8]  Audrey R. Nath,et al.  Dynamic Changes in Superior Temporal Sulcus Connectivity during Perception of Noisy Audiovisual Speech , 2011, The Journal of Neuroscience.

[9]  John J. Foxe,et al.  Crossmodal binding through neural coherence: implications for multisensory processing , 2008, Trends in Neurosciences.

[10]  Emiliano Macaluso,et al.  Spatial Constraints in Multisensory Attention , 2012 .

[11]  M HERSHENSON,et al.  Reaction time as a measure of intersensory facilitation. , 1962, Journal of experimental psychology.

[12]  N. Weisz,et al.  Prestimulus beta power and phase synchrony influence the sound-induced flash illusion. , 2014, Cerebral cortex.

[13]  J. Haxby,et al.  Functional Magnetic Resonance Imaging of Human Visual Cortex during Face Matching: A Comparison with Positron Emission Tomography , 1996, NeuroImage.

[14]  Gregory Hickok,et al.  An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex , 2013, PloS one.

[15]  Viktor K. Jirsa,et al.  Multisensory integration for timing engages different brain networks , 2007, NeuroImage.

[16]  Anne-Lise Giraud,et al.  Prediction across sensory modalities: A neurocomputational model of the McGurk effect , 2015, Cortex.

[17]  R. Campbell,et al.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex , 2000, Current Biology.

[18]  H Petsche,et al.  Synchronization between temporal and parietal cortex during multimodal object processing in man. , 1999, Cerebral cortex.

[19]  C. K. Peck,et al.  Visual-auditory integration in cat superior colliculus: implications for neuronal control of the orienting response. , 1996, Progress in brain research.

[20]  Terrence R Stanford,et al.  A Model of the Neural Mechanisms Underlying Multisensory Integration in the Superior Colliculus , 2007, Perception.

[21]  H. McGurk,et al.  Hearing lips and seeing voices , 1976, Nature.

[22]  Jeffery A. Jones,et al.  Neural processes underlying perceptual enhancement by visual speech gestures , 2003, Neuroreport.

[23]  Lea Fleischer,et al.  The Senses Considered As Perceptual Systems , 2016 .

[24]  Justine Cléry,et al.  Whole brain mapping of visual and tactile convergence in the macaque monkey , 2015, NeuroImage.

[25]  C. Schroeder,et al.  Neuronal Oscillations and Multisensory Interaction in Primary Auditory Cortex , 2007, Neuron.

[26]  Gustavo Deco,et al.  Role of local network oscillations in resting-state functional connectivity , 2011, NeuroImage.

[27]  M. Ernst,et al.  Humans integrate visual and haptic information in a statistically optimal fashion , 2002, Nature.

[28]  Albert R. Powers,et al.  Neural Correlates of Multisensory Perceptual Learning , 2012, The Journal of Neuroscience.

[29]  Konrad Paul Kording,et al.  Causal Inference in Multisensory Perception , 2007, PloS one.

[30]  Mauro Ursino,et al.  Neurocomputational approaches to modelling multisensory integration in the brain: A review , 2014, Neural Networks.

[31]  Mauro Ursino,et al.  A theoretical study of multisensory integration in the superior colliculus by a neural network model , 2008, Neural Networks.

[32]  P. Bertelson,et al.  Multisensory integration, perception and ecological validity , 2003, Trends in Cognitive Sciences.

[33]  T. Stanford,et al.  Nonvisual influences on visual-information processing in the superior colliculus. , 2001, Progress in brain research.

[34]  A. Puce,et al.  Neuronal oscillations and visual amplification of speech , 2008, Trends in Cognitive Sciences.

[35]  Yoshinao Kajikawa,et al.  Cortical connections of the auditory cortex in marmoset monkeys: Core and medial belt regions , 2006, The Journal of comparative neurology.

[36]  David Poeppel,et al.  Cortical Oscillations in Auditory Perception and Speech: Evidence for Two Temporal Windows in Human Auditory Cortex , 2012, Front. Psychology.

[37]  Michael Breakspear,et al.  A Canonical Model of Multistability and Scale-Invariance in Biological Systems , 2012, PLoS Comput. Biol..

[38]  Nadia Bolognini,et al.  A neurocomputational analysis of the sound-induced flash illusion , 2014, NeuroImage.

[39]  Gregor Thut,et al.  Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility , 2016, eLife.

[40]  Rodrigo Quian Quiroga,et al.  Spatio-temporal frequency characteristics of intersensory components in audiovisually evoked potentials. , 2005, Brain research. Cognitive brain research.

[41]  Andreas K. Engel,et al.  Oscillatory Synchronization in Large-Scale Cortical Networks Predicts Perception , 2011, Neuron.

[42]  David Poeppel,et al.  Cortical oscillations and speech processing: emerging computational principles and operations , 2012, Nature Neuroscience.

[43]  Mauro Ursino,et al.  A Neural Network Model of Ventriloquism Effect and Aftereffect , 2012, PloS one.

[44]  H. Kennedy,et al.  Anatomical Evidence of Multimodal Integration in Primate Striate Cortex , 2002, The Journal of Neuroscience.

[45]  J. Schwartz,et al.  A possible neurophysiological correlate of audiovisual binding and unbinding in speech perception , 2014, Front. Psychol..

[46]  Gustavo Deco,et al.  Multi-stable perception balances stability and sensitivity , 2013, Front. Comput. Neurosci..

[47]  Kathleen S Rockland,et al.  Multisensory convergence in calcarine visual areas in macaque monkey. , 2003, International journal of psychophysiology : official journal of the International Organization of Psychophysiology.

[48]  David Poeppel,et al.  Visual speech speeds up the neural processing of auditory speech. , 2005, Proceedings of the National Academy of Sciences of the United States of America.

[49]  Lawrence G. McDade,et al.  Behavioral Indices of Multisensory Integration: Orientation to Visual Cues is Affected by Auditory Stimuli , 1989, Journal of Cognitive Neuroscience.

[50]  Mareike M. Menz,et al.  Multisensory interactions between auditory and haptic object recognition. , 2013, Cerebral cortex.

[51]  A. Macleod,et al.  Quantifying the contribution of vision to speech perception in noise. , 1987, British journal of audiology.

[52]  W. H. Sumby,et al.  Visual contribution to speech intelligibility in noise , 1954 .

[53]  N. Weisz,et al.  On the variability of the McGurk effect: audiovisual integration depends on prestimulus brain states. , 2012, Cerebral cortex.

[54]  H. Scheich,et al.  Multisensory processing via early cortical stages: Connections of the primary auditory cortical field with other sensory systems , 2006, Neuroscience.

[55]  T. Hackett,et al.  Multisensory convergence in auditory cortex, I. Cortical connections of the caudal superior temporal plane in macaque monkeys , 2007, The Journal of comparative neurology.

[56]  B. Stein,et al.  Interactions among converging sensory inputs in the superior colliculus. , 1983, Science.

[57]  D W Massaro,et al.  Perception of asynchronous and conflicting visual and auditory speech. , 1996, The Journal of the Acoustical Society of America.

[58]  Lawrence M. Ward,et al.  Asynchrony from synchrony: long-range gamma-band neural synchrony accompanies perception of audiovisual speech asynchrony , 2008, Experimental Brain Research.

[59]  Rainer Goebel,et al.  Interaction of speech and script in human auditory cortex: Insights from neuro-imaging and effective connectivity , 2009, Hearing Research.

[60]  C. Frith,et al.  Modulation of human visual cortex by crossmodal spatial attention. , 2000, Science.

[61]  Yoshiki Kuramoto,et al.  Chemical Oscillations, Waves, and Turbulence , 1984, Springer Series in Synergetics.

[62]  Keiji Tanaka,et al.  Functional Division Among Monkey Prefrontal Areas in Goal-Directed Behavior , 2010 .

[63]  M. Lévesque Perception , 1986, The Yale Journal of Biology and Medicine.

[64]  M. Wallace,et al.  A revised view of sensory cortical parcellation , 2004, Proceedings of the National Academy of Sciences of the United States of America.

[65]  Lisa A. de la Mothe,et al.  Thalamic connections of the auditory cortex in marmoset monkeys: Core and medial belt regions , 2006, The Journal of comparative neurology.

[66]  P. Barone,et al.  Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey , 2005, The European journal of neuroscience.

[67]  Dominic W. Massaro,et al.  Perceiving asynchronous bimodal speech in consonant-vowel and vowel syllables , 1993, Speech Commun..

[68]  Gregor Thut,et al.  Auditory–Visual Multisensory Interactions in Humans: Timing, Topography, Directionality, and Sources , 2010, The Journal of Neuroscience.

[69]  Viktor K. Jirsa,et al.  How do neural connectivity and time delays influence bimanual coordination? , 2007, Biological Cybernetics.

[70]  P. Gribble,et al.  Temporal constraints on the McGurk effect , 1996, Perception & psychophysics.

[71]  L. Harris,et al.  Simultaneity Constancy , 2004, Perception.

[72]  M A Meredith,et al.  Descending efferents from the superior colliculus relay integrated multisensory information. , 1985, Science.

[73]  Mark P. Richardson,et al.  Dynamics on Networks: The Role of Local Dynamics and Global Networks on the Emergence of Hypersynchronous Neural Activity , 2014, PLoS Comput. Biol..

[74]  Andreas Daffertshofer,et al.  Generative Models of Cortical Oscillations: Neurobiological Implications of the Kuramoto Model , 2010, Front. Hum. Neurosci..

[75]  Osamu Hoshino,et al.  Neuronal Responses Below Firing Threshold for Subthreshold Cross-Modal Enhancement , 2011, Neural Computation.

[76]  Mingzhou Ding,et al.  Will a large complex system with time delays be stable? , 2004, Physical review letters.