Social Visual Behavior Analytics for Autism Therapy of Children Based on Automated Mutual Gaze Detection

Social visual behavior, a form of non-verbal communication, plays a central role in studying social cognitive processes in the interactive and complex settings of autism therapy interventions. However, manually collecting and evaluating gaze data for social visual behavior analytics in children with autism is challenging, as it demands considerable time and effort from human coders. In this paper, we introduce a social visual behavior analytics approach that quantifies the mutual gaze performance of children receiving play-based autism interventions using an automated mutual gaze detection framework. Our analysis is based on a video dataset capturing social interactions between children with autism and their therapy trainers (N=28 observations, 84 video clips, 21 hours of footage). We evaluated the effectiveness of our framework by comparing the mutual gaze ratio derived from the detection framework with human-coded ratio values. We analyzed mutual gaze frequency and duration across different therapy settings, activities, and sessions, and created mutual gaze-related measures for social visual behavior score prediction using multiple machine learning-based regression models. The results show that our method provides mutual gaze measures that reliably represent (or even replace) human coders' hand-coded social gaze measures, and that it effectively evaluates and predicts the social visual performance of children with ASD during the intervention. Our findings have implications for social interaction analysis in small-group behavior assessments across numerous co-located settings in (special) education and in the workplace.
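The mutual gaze measures described above can be illustrated with a small sketch. Assuming the detection framework emits a per-frame binary label (1 = mutual gaze detected, 0 = not), the ratio, frequency, and mean duration can be derived as follows; the function name and signature are illustrative, not from the paper.

```python
def mutual_gaze_measures(frames, fps=30):
    """Derive (ratio, frequency, mean_duration_s) from a binary
    per-frame mutual gaze sequence (hypothetical detector output)."""
    total = len(frames)
    if total == 0:
        return 0.0, 0, 0.0
    # Ratio: fraction of frames where mutual gaze was detected.
    ratio = sum(frames) / total
    # Frequency: count contiguous runs of 1s as individual gaze events.
    events = []
    run = 0
    for f in frames:
        if f:
            run += 1
        elif run:
            events.append(run)
            run = 0
    if run:
        events.append(run)
    frequency = len(events)
    # Mean duration of a gaze event, converted from frames to seconds.
    mean_duration = (sum(events) / len(events) / fps) if events else 0.0
    return ratio, frequency, mean_duration
```

The frame-level ratio computed this way could then be correlated with a human coder's ratio (e.g., via Pearson correlation) to assess agreement, as the evaluation described in the abstract does.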
