Gender differences in identifying emotions from auditory and visual stimuli

Abstract The present study focused on gender differences in identifying emotions from auditory and visual stimuli produced by two male and two female actors. Differences in emotion identification from nonsense samples, language samples, and prolonged vowels were investigated. It was also studied whether auditory stimuli can convey the emotional content of speech without visual stimuli, and whether visual stimuli can convey it without auditory stimuli. The aim was to gain better knowledge of vocal attributes and a more holistic understanding of the nonverbal communication of emotion. Females tended to be more accurate in emotion identification than males. Voice quality parameters played a role in emotion identification for both genders. The emotional content of the samples was conveyed best by nonsense sentences, better than by prolonged vowels or by samples in the shared native language of the speakers and participants. Thus, vocal nonverbal communication tends to affect the interpretation of emotion even in the absence of language. Emotions were recognized more accurately from visual stimuli than from auditory stimuli by both genders. Visual information about speech may not be tied to language; instead, it may rest on the human ability to interpret the kinetic movements of speech production more readily than the characteristics of the acoustic cues.
