Automated Student Engagement Monitoring and Evaluation during Learning in the Wild

With the explosive growth of edge computing and massive open online courses (MOOCs), there is an urgent need to enable pervasive learning so that students can study efficiently, at their own pace, in any comfortable place. Although a number of studies and applications address student engagement monitoring and evaluation in pervasive learning, most existing works either depend on commercial eye-tracking devices/software or are designed for offline studies based on questionnaires, self-reports, checklists, quizzes, teacher introspective evaluations, and assignments. In this work, we investigate the feasibility of real-time student engagement monitoring and evaluation with low-cost, off-the-shelf web cameras in realistic learning scenarios. To recognize and evaluate student engagement, a new model is developed and trained as a deep Convolutional Neural Network (CNN) on an open-source dataset. The quantitative experimental results demonstrate that the deep CNN and our model work well and efficiently when monitoring student learning and detecting student engagement in real time.
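The pipeline the abstract describes, mapping a webcam face crop through a CNN to an engagement label, can be sketched minimally as below. This is an illustrative assumption, not the paper's actual architecture: the layer sizes, filter counts, and four-level engagement classes are hypothetical, and the weights are random rather than trained.

```python
import numpy as np

# Hypothetical engagement classes; the paper's label set may differ.
ENGAGEMENT_CLASSES = ["disengaged", "low", "engaged", "highly_engaged"]

def conv2d(image, kernels):
    """Valid 2-D convolution of a grayscale image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    h, w = image.shape
    out = np.empty((kernels.shape[0], h - kh + 1, w - kw + 1))
    for c, k in enumerate(kernels):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[c, i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def engagement_scores(face_crop, kernels, weights):
    """One forward pass: conv + ReLU, global average pool, linear head."""
    feat = np.maximum(conv2d(face_crop, kernels), 0)  # conv + ReLU
    pooled = feat.mean(axis=(1, 2))                   # global average pooling
    return softmax(weights @ pooled)                  # class probabilities

# Random stand-ins for a detected face crop and trained parameters.
rng = np.random.default_rng(0)
frame = rng.random((32, 32))
kernels = rng.standard_normal((8, 3, 3))   # 8 filters of size 3x3
weights = rng.standard_normal((4, 8))      # linear head over 4 classes
probs = engagement_scores(frame, kernels, weights)
label = ENGAGEMENT_CLASSES[int(np.argmax(probs))]
```

In a real-time setting, each webcam frame would first pass through a face detector, and the per-frame probabilities could be smoothed over a sliding window before reporting an engagement level.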
