Semantic integration of temporally asynchronous audio–visual information in videos of real-world events: An ERP study

In the real world, some of the auditory and visual information received by the human brain is temporally asynchronous. How is such information integrated during cognitive processing? In this study, we used the event-related potential (ERP) method to examine the semantic integration of audio-visual information presented with different asynchronies. Subjects viewed videos of real-world events in which the auditory and visual information were temporally asynchronous. When the critical action preceded the sound, sounds incongruous with the preceding critical action elicited an N400 effect relative to the congruous condition. This result demonstrates that the semantic contextual integration indexed by the N400 also applies to the cognitive processing of multisensory information. Moreover, this N400 effect occurred earlier than the N400 effects reported in studies using visual stimuli alone, suggesting that cross-modal information is processed faster than visual information in isolation. When the sound preceded the critical action, a larger late positive wave (P600) was observed in the incongruous condition than in the congruous condition. This P600 may reflect a reanalysis process in which the mismatch between the critical action and the preceding sound was evaluated, indicating that environmental sounds can affect the cognitive processing of a visual event.
