Affect bursts to constrain the meaning of the facial expressions of the humanoid robot Zeno

When a robot is used in an intervention to help autistic children learn emotional skills, it is particularly important that the robot's facial expressions of emotion are well recognised. However, recognising which emotion a robot is expressing, based solely on its facial expressions, can be difficult. To improve recognition rates, we added affect bursts to a set of caricatured and a set of more humanlike facial expressions, using Robokind's R25 Zeno robot. Twenty-eight typically developing children participated in this study. We found no significant difference between the two sets of facial expressions. However, the addition of affect bursts significantly improved recognition rates by helping to constrain the meaning of the facial expressions.