Measuring Student Engagement Level Using Facial Information

In this paper, we propose a novel framework that measures the engagement level of students in either a classroom or an e-learning environment. The proposed framework captures video of the user and tracks his or her face across the video frames. Different features are extracted from the user's face, e.g., facial fiducial points, head pose, eye gaze, and learned features. These features are then used to detect action units as defined by the Facial Action Coding System (FACS), which decomposes facial expressions into the fundamental actions of individual muscles or groups of muscles (i.e., action units). The decoded action units (AUs) are then used to measure the student's willingness to participate in the learning process (i.e., behavioral engagement) and his or her emotional attitude toward learning (i.e., emotional engagement). This framework allows the lecturer to receive real-time feedback from facial features, gaze, and other body kinesics. The framework is robust and can be utilized in numerous applications, including but not limited to monitoring the progress of students with various degrees of learning disabilities and analyzing the effects of nerve palsy on facial expressions and social interactions.
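As a rough illustration of the pipeline described above, the Python sketch below shows how per-frame action-unit intensities could be aggregated into behavioral and emotional engagement scores. The OpenCV capture and face-detection calls are real; the AU detector is a stub and the AU-to-engagement mapping is a hypothetical rule-based placeholder with illustrative weights, not the authors' implementation.

```python
# Sketch of the abstract's pipeline: capture video, detect the face,
# estimate action-unit (AU) intensities, and map them to engagement scores.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_action_units(face_roi):
    """Stub AU detector: returns zero intensities for a few FACS AUs.

    A real system would decode these with a trained model (the paper
    uses deep region-based learning); this stub only fixes the interface."""
    return {"AU1": 0.0, "AU2": 0.0, "AU4": 0.0, "AU12": 0.0, "AU43": 0.0}

def engagement_from_aus(aus):
    """Hypothetical rule-based mapping from AU intensities to engagement.

    AU labels follow FACS: AU1/AU2 (brow raise), AU4 (brow lower),
    AU12 (lip-corner pull), AU43 (eyes closed). Weights are illustrative."""
    behavioral = max(0.0, 1.0 - aus["AU43"])          # eyes open -> attending
    emotional = (0.5 * aus["AU12"]                    # smile -> positive affect
                 + 0.3 * (aus["AU1"] + aus["AU2"]) / 2
                 - 0.4 * aus["AU4"])                  # frown -> negative affect
    return behavioral, max(0.0, min(1.0, emotional))

cap = cv2.VideoCapture(0)  # webcam; a classroom recording would also work
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        aus = detect_action_units(gray[y:y + h, x:x + w])
        behavioral, emotional = engagement_from_aus(aus)
        # Real-time feedback for the lecturer would be rendered here.
cap.release()
```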
