Integration of cross-modal emotional information in the human brain: An fMRI study

The interaction of information derived from a speaker's voice and facial expression contributes both to interpreting the speaker's emotional state and to forming inferences about information that may have been only implied in the verbal communication. We therefore investigated the brain processes responsible for integrating emotional information from different sources. Although several studies have reported candidate sites for this integration, further investigation using an emotionally neutral condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in integrating emotional information across modalities, in comparison with those involved in integrating emotionally neutral information. The bimodal condition, relative to the unimodal conditions, elicited significant activation in the superior temporal gyrus (STG), inferior frontal gyrus (IFG), and parahippocampal gyrus, including the amygdala, irrespective of emotional content. We replicated previous findings that the bimodal emotional condition elicits strong activation in the left middle temporal gyrus (MTG), and we extended this work by including a neutral condition in the experimental design to isolate the effects of emotional factors. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas happiness-specific activation appeared in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion engages a distinct network for integrating bimodal information while sharing a common network for cross-modal integration.
