An ERP study on whether semantic integration exists in processing ecologically unrelated audio–visual information

In the present study, we used event-related potentials (ERPs) to examine whether semantic integration occurs for ecologically unrelated audio-visual information. Videos with synchronous audio-visual information served as stimuli: the auditory stimuli were sine-wave tones at different sound levels, and the visual stimuli were simple geometric figures of different areas. In the experiment, participants were first shown a display containing a single shape (drawn from a set of six shapes) of fixed size (14 cm²) together with a 3500 Hz tone of fixed intensity (80 dB). After a short delay, another shape/tone pair was presented, and the relationship between the size of the shape and the intensity of the tone varied across trials: in the V+A- condition, a large shape was paired with a soft tone; in the V+A+ condition, a large shape was paired with a loud tone; and so forth. The ERP results revealed that an N400 effect was elicited in the VA- conditions (V+A- and V-A+) relative to the VA+ conditions (V+A+ and V-A-). These findings suggest that semantic integration occurs when simultaneous, ecologically unrelated auditory and visual stimuli reach the human brain. We propose that this semantic integration rests on a semantic constraint between the audio-visual information, which may derive from long-term learned associations stored in the brain and from short-term experience with the incoming information.
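The 2×2 condition structure described above (shape size crossed with tone intensity, grouped into congruent VA+ and incongruent VA- pairs) can be sketched as follows. This is an illustrative reconstruction, not the authors' stimulus code; the function and label names are our own.

```python
# Illustrative sketch of the 2x2 audio-visual condition design from the abstract.
# V+/V- = large/small shape; A+/A- = loud/soft tone. Congruent pairings
# (V+A+, V-A-) form the VA+ group; incongruent pairings (V+A-, V-A+) form VA-.
from itertools import product

VISUAL = {"V+": "large shape", "V-": "small shape"}
AUDITORY = {"A+": "loud tone", "A-": "soft tone"}


def build_conditions():
    """Return the four experimental conditions with their VA+/VA- grouping."""
    conditions = {}
    for v, a in product(VISUAL, AUDITORY):
        congruent = (v == "V+") == (a == "A+")
        conditions[v + a] = {
            "visual": VISUAL[v],
            "auditory": AUDITORY[a],
            "group": "VA+" if congruent else "VA-",
        }
    return conditions


conds = build_conditions()
# The N400 effect reported in the study is the contrast VA- minus VA+.
assert conds["V+A-"]["group"] == "VA-"
assert conds["V-A-"]["group"] == "VA+"
```

The grouping mirrors the contrast used in the ERP analysis: the N400 effect is computed by comparing the two incongruent (VA-) cells against the two congruent (VA+) cells.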
