Toward Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children With Autism Through Wearable Smart Glasses

Background: Several studies have shown that facial attention differs in children with autism. Measuring eye gaze and emotion recognition in children with autism is challenging, as standard assessments must be delivered in clinical settings by a trained clinician. Wearable technologies may be able to bring eye gaze and emotion recognition into natural social interactions and settings.

Objective: This study aimed to test (1) the feasibility of tracking gaze using wearable smart glasses during a facial expression recognition task and (2) the ability of these gaze-tracking data, together with facial expression recognition responses, to distinguish children with autism from neurotypical controls (NCs).

Methods: We compared the eye gaze and emotion recognition patterns of 16 children with autism spectrum disorder (ASD) and 17 children without ASD using wearable smart glasses fitted with a custom eye tracker. While wearing Google Glass and the eye tracker, children identified static facial expressions in images presented on a computer screen alongside nonsocial distractors. Faces were presented in three trials, during one of which children received feedback in the form of the correct classification. We used hybrid human-labeling and computer vision-enabled methods for pupil tracking and world-gaze translation calibration. We then analyzed the contribution of gaze and emotion recognition features to a prediction task that aimed to distinguish children with ASD from NC participants.

Results: Gaze and emotion recognition patterns enabled the training of a classifier that distinguished the ASD and NC groups. However, this classifier did not significantly outperform classifiers that used only age and gender features, suggesting that further work is necessary to disentangle these effects.

Conclusions: Although wearable smart glasses show promise for identifying subtle differences in gaze tracking and emotion recognition patterns in children with and without ASD, the present form factor and data do not allow these differences to be reliably exploited by machine learning systems. Resolving these challenges will be an important step toward continuous tracking of the ASD phenotype.
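The abstract does not detail the computer vision portion of the pupil-tracking step, so the following is only a minimal sketch of one common approach: dark-blob segmentation of a grayscale eye-region crop with OpenCV, followed by a centroid estimate. The function name, threshold offset, and synthetic test frame are illustrative assumptions, not the authors' hybrid human-labeling pipeline.

```python
import cv2
import numpy as np

def estimate_pupil_center(eye_roi_gray):
    """Estimate the pupil centre in a grayscale (uint8) eye-region crop.

    Crude dark-blob heuristic: blur, keep pixels close to the darkest value,
    and return the centroid of the largest connected component.
    """
    blurred = cv2.GaussianBlur(eye_roi_gray, (7, 7), 0)
    thresh_val = int(blurred.min()) + 15  # offset above the darkest pixel (tunable)
    _, mask = cv2.threshold(blurred, thresh_val, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x API: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Synthetic test frame: a dark disk (the "pupil") on a brighter background.
frame = np.full((120, 160), 200, dtype=np.uint8)
cv2.circle(frame, (90, 60), 12, 30, -1)
print(estimate_pupil_center(frame))  # expected to be near (90.0, 60.0)
```

In practice the estimated pupil centre would then be mapped into the world-camera frame during the world-gaze translation calibration step, which is not sketched here.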

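The prediction task described in the Results can likewise be illustrated with a hedged sketch: a cross-validated classifier trained on per-child gaze and emotion recognition summaries, compared against an age-and-gender baseline. The feature definitions, placeholder data, and model choice (a random forest) are assumptions for illustration only and are not taken from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n = 33  # 16 children with ASD + 17 NC children, as in the study

# Placeholder feature matrices; in practice these would be per-child summaries
# such as fixation fractions on face vs. distractor regions and per-emotion accuracy.
gaze_emotion_features = rng.normal(size=(n, 10))   # hypothetical engineered features
age_gender_features = np.column_stack([
    rng.integers(6, 13, size=n),   # age in years (placeholder values)
    rng.integers(0, 2, size=n),    # gender coded 0/1 (placeholder values)
])
labels = np.array([1] * 16 + [0] * 17)  # 1 = ASD, 0 = NC

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, X in [("gaze + emotion", gaze_emotion_features),
                ("age + gender baseline", age_gender_features)]:
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, labels, cv=cv, scoring="roc_auc")
    print(f"{name}: AUC = {scores.mean():.2f} ± {scores.std():.2f}")
```

Comparing the feature sets under the same cross-validation scheme mirrors the abstract's finding: the gaze and emotion features must beat the age-and-gender baseline before they can be credited with diagnostic signal.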