Building a game scenario to encourage children with autism to recognize and label emotions using a humanoid robot

This paper presents an exploratory study in which children with autism interact with ZECA (Zeno Engaging Children with Autism), a humanoid robot whose face is covered with a material that allows it to display varied facial expressions. The study investigates a novel robot-assisted play scenario intended to help promote the labelling of emotions by children with autism spectrum disorder (ASD). The study was carried out over three sessions with two boys diagnosed with ASD. The analysis of the children's behaviours while interacting with ZECA helped us improve several aspects of the game scenario, such as its technical specificities and dynamics, as well as the experimental setup. The software produced for this study allows the robot to autonomously identify the child's answers during the session. This automatic identification improved the fluidity of the game and freed the experimenter to take part in triadic interactions with the child. The main goal of this pilot study was to evaluate the game scenario to be used in a future study, rather than to quantify and assess the children's performance. Overall, this exploratory study on teaching children to label emotions using a humanoid robot embedded in a game scenario demonstrated the possible positive outcomes this kind of child-robot interaction can produce and highlighted issues regarding data collection and analysis that will inform future studies.
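The abstract does not describe the internals of the software that lets the robot identify the child's answers autonomously. The following Python sketch only illustrates one plausible shape of such a game loop under stated assumptions: the emotion set, the `robot` interface (`show_expression`, `say`), and the `read_answer` input channel are all hypothetical placeholders, not the authors' actual API.

```python
import random
import time

# Hypothetical emotion set for the labelling game (assumption, not from the paper).
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise"]


class EmotionLabellingGame:
    """Minimal sketch of a robot-led emotion-labelling game loop.

    `robot` is a placeholder interface assumed to expose show_expression(name)
    and say(text); `read_answer` stands in for whatever input channel the real
    system uses to capture the child's answer automatically.
    """

    def __init__(self, robot, read_answer, rounds=5):
        self.robot = robot
        self.read_answer = read_answer
        self.rounds = rounds
        self.log = []  # per-round records kept for later behavioural analysis

    def play(self):
        for round_no in range(1, self.rounds + 1):
            target = random.choice(EMOTIONS)
            self.robot.show_expression(target)       # robot displays the facial expression
            self.robot.say("How am I feeling?")      # prompt the child to label it

            answer = self.read_answer(timeout=10.0)  # autonomous answer capture (placeholder)
            correct = answer == target

            # Immediate feedback keeps the game fluid without the experimenter stepping in.
            if correct:
                self.robot.say("Well done!")
            else:
                self.robot.say(f"I was feeling {target}. Let's try another one!")

            self.log.append({
                "round": round_no,
                "target": target,
                "answer": answer,
                "correct": correct,
                "timestamp": time.time(),
            })
        return self.log
```

In this sketch the answer capture and feedback are handled entirely by the software, which mirrors the paper's point that automatic identification frees the experimenter to join triadic interactions; the logged records stand in for the observational data analysed after each session.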
