Supramodal Representation of Emotions

Supramodal representation of emotion and its neural substrates have recently attracted attention as a marker of social cognition. However, the question of whether perceptual integration of facial and vocal emotions takes place in primary sensory areas, multimodal cortices, or affective structures remains unanswered. Using novel computer-generated stimuli, we combined emotional faces and voices in congruent and incongruent ways and acquired functional brain data (fMRI) during an emotional classification task. Both congruent and incongruent audiovisual stimuli evoked larger responses in the thalamus and superior temporal regions than unimodal conditions did. Congruent emotions were characterized by activation in the amygdala, insula, ventral posterior cingulate cortex (vPCC), and temporo-occipital and auditory cortices; incongruent emotions activated a frontoparietal network and the bilateral caudate nucleus, indicating a greater processing load in working-memory and emotion-encoding areas. The vPCC alone exhibited differential responses to congruency and incongruency across all emotion categories and can thus be considered a central structure for the supramodal representation of complex emotional information. Moreover, the left amygdala reflected the supramodal representation of happy stimuli. These findings indicate that emotional information does not merge at the level of perceptual audiovisual integration in unimodal or multimodal areas, but rather in the vPCC and amygdala.