Minimalistic Toy Robot Encourages Verbal and Emotional Expressions in Autism

Language makes it possible to transfer information between a speaker and a listener who both possess the ability to use it. Using a "speaker-listener" situation, we compared the verbal and emotional expressions of neurotypical and autistic children aged 6 to 7 years. The speaker was always a child (neurotypical or autistic); the listener was either a human InterActor or an InterActor robot, i.e., a small toy robot that reacts to speech solely by nodding. The results suggest that a robot characterized by predictable reactions facilitates verbal and emotional expression in autistic children. Compared with the performance of neurotypical children, the data indicate that minimalistic artificial environments have the potential to support neuronal organization and reorganization, and thereby the embrainment of verbal and emotional information processing, in autistic children.

Keywords—brain development; neurotypical children; children with autism; minimalistic robot; language; emotion
