Estimating Children Engagement Interacting with Robots in Special Education Using Machine Learning

The task of estimating a child's engagement while interacting with a social robot during a special education procedure is studied. A multimodal, machine learning-based methodology is proposed for estimating the engagement of children with learning difficulties who participate in appropriately designed educational scenarios. For this purpose, visual and audio data are gathered during the child-robot interaction and processed to decide whether the child is engaged or not. Six single and three ensemble machine learning models are examined for the accuracy of their decisions on an in-house developed dataset. The conducted experiments revealed that, using multimodal data and an AdaBoost ensemble of decision trees, children's engagement can be estimated with 93.33% accuracy. Moreover, an important outcome of this study is the need to explicitly define what engagement means in each scenario. The results are very promising and pave the way for research on closed-loop, human-centric special education activities using social robots.
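The abstract names AdaBoost with decision-tree base learners as the best-performing ensemble for the binary engaged/not-engaged decision. The sketch below is not the authors' code; it only illustrates how such a classifier could be trained on pre-extracted multimodal feature vectors using scikit-learn. The feature dimensionality, the synthetic data, and the hyperparameters are placeholder assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): binary
# engagement classification with an AdaBoost ensemble of decision trees,
# given pre-extracted multimodal (visual + audio) features per segment.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data: each row would concatenate visual features (e.g. facial
# action units, gaze, pose) with audio features; labels: 1 = engaged, 0 = not.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))      # hypothetical multimodal feature matrix
y = rng.integers(0, 2, size=300)    # hypothetical engagement labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# AdaBoost over shallow decision trees (scikit-learn >= 1.2 uses `estimator`;
# older versions use `base_estimator`). Depth and ensemble size are guesses.
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=2),
    n_estimators=100,
    random_state=0,
)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real features in place of the random placeholders, the same script structure supports comparing the single and ensemble models the study mentions by swapping the classifier object.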
