Timing of audiovisual inputs to the prefrontal cortex and multisensory integration

A number of studies have demonstrated that the relative timing of audiovisual stimuli is especially important for multisensory integration of speech signals, although the neuronal mechanisms underlying this complex behavior remain unknown. Temporal coincidence and congruency are thought to underlie the successful merging of two intermodal stimuli into a coherent perceptual representation. It has previously been shown that single neurons in the non-human primate prefrontal cortex integrate face and vocalization information. However, the timing of these multisensory responses and the degree to which they depend on temporal coincidence have yet to be determined. In this study we analyzed the response latency of ventrolateral prefrontal cortex (VLPFC) neurons to a face stimulus, a vocalization stimulus, a combined face-vocalization stimulus, and a temporally offset (asynchronous) version of the face-vocalization stimulus. Our results indicate that for most prefrontal multisensory neurons the response latency to the vocalization was shortest, followed by the combined face-vocalization stimulus; the face stimulus evoked the longest onset latency. When tested with a dynamic face-vocalization stimulus that had been temporally offset, one-third of the multisensory cells in VLPFC changed their response relative to the natural, synchronous face-vocalization movie. These results indicate that prefrontal neurons are sensitive to the temporal properties of audiovisual stimuli. A disruption of temporal synchrony in an audiovisual signal that alters the firing of communication-related prefrontal neurons could underlie the loss of intelligibility that occurs with asynchronous speech stimuli.
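The analysis described above rests on estimating the onset response latency of single VLPFC neurons and comparing it across the face, vocalization, and combined face-vocalization conditions. The exact estimator is not specified in this abstract, so the following is only a minimal illustrative sketch in Python, assuming spike times are stored per trial in milliseconds relative to stimulus onset and using a generic PSTH threshold-crossing rule; the function name onset_latency and all parameter values (bin width, baseline and response windows, threshold) are hypothetical choices, not the authors' method.

import numpy as np

def onset_latency(spike_trains, bin_ms=10, baseline_ms=500, response_ms=500,
                  n_sd=3.0, n_consecutive=2):
    """Estimate onset latency (ms after stimulus onset) from a list of per-trial
    spike-time arrays (ms, stimulus onset at t = 0). Returns None if the firing
    rate never exceeds the baseline-derived threshold."""
    edges = np.arange(-baseline_ms, response_ms + bin_ms, bin_ms)
    # Trial-averaged spike counts per bin, converted to firing rate (spikes/s).
    counts = np.mean([np.histogram(t, bins=edges)[0] for t in spike_trains], axis=0)
    rate = counts / (bin_ms / 1000.0)
    # Threshold = pre-stimulus mean rate plus n_sd standard deviations.
    baseline = rate[edges[:-1] < 0]
    threshold = baseline.mean() + n_sd * baseline.std()
    # Latency = left edge of the first run of consecutive supra-threshold bins.
    above = rate[edges[:-1] >= 0] > threshold
    for i in range(len(above) - n_consecutive + 1):
        if above[i:i + n_consecutive].all():
            return i * bin_ms
    return None

# Hypothetical usage: compute latencies separately for each stimulus condition
# (e.g., vocalization alone, face alone, synchronous and asynchronous
# face-vocalization movies) and compare them neuron by neuron.

Applied condition by condition, an estimator of this kind would yield the ordering reported above, with vocalization responses earliest, combined face-vocalization responses intermediate, and face responses latest.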
