Impact of personality on the recognition of emotion expressed via human, virtual, and robotic embodiments

In this paper, we describe the construction and validation of a body and face video database1, comprising 96 videos of 1 to 2 seconds each, expressing 4 emotions (anger, happiness, fear, and sadness) elicited through 4 platforms of increasing visual complexity and level of embodiment. The ultimate aim of this database is to develop an individualized training program designed to help individuals with autism recognize various emotions across different test platforms: two robots, a virtual agent, and a human. Before assessing the recognition capabilities of individuals with ASD, we validated our video database on typically developing (TD) individuals. We also examined the relationship between recognition rates and personality traits (extroverted (EX) vs. introverted (IN)). We found that the personality of our TD participants did not lead to an overall difference in recognition behavior; however, introverted individuals recognized emotions from less visually complex characters better than extroverted individuals did.