Towards an Emotional Engagement Model: Can Affective States of a Learner be Automatically Detected in a 1:1 Learning Scenario?

Existing Intelligent Tutoring Systems (ITSs) are unable to track the affective states of learners. In this paper, we focus on the problem of emotional engagement and propose to detect important affective states (i.e., ‘Satisfied’, ‘Bored’, and ‘Confused’) of a learner in real time. We collected 210 hours of data from 20 students through authentic classroom pilots. The data included information from two modalities: (1) appearance, captured by a camera, and (2) context-performance, derived from the content platform. In this paper, data from nine students who attended the learning sessions twice a week are analyzed. We trained separate classifiers for the two modalities (appearance and context-performance) and for the two types of learning sections (instructional and assessment). The results show that different sources of information are better representatives of engagement in different section types: for instructional sections, the generic appearance classifier yields higher accuracy (55.79%), whereas the context-performance classifier is more accurate for assessment sections (63.41%). Moreover, the results indicate that the expression of engagement is person-specific in both of these sources, and personalized engagement models perform more accurately: when person-specific data are added to the training set, accuracies of 85.44% (appearance) and 96.13% (context-performance) are achieved on instructional sections. For assessment sections, the accuracies are 75.25% (appearance) and 90.24% (context-performance). When only person-specific data are employed during training, similar accuracies are achieved even with very limited data.

CCS Concepts: • Human-centered computing ➝ Human-computer interaction • Human-centered computing ➝ Personal computing.
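The gap between generic and personalized models described above can be illustrated with a minimal sketch. This is not the authors' pipeline: the features, dimensions, the person-specific offset, and the nearest-centroid classifier are all assumptions chosen for a self-contained toy example. It only shows the general mechanism: when a learner expresses engagement differently from the population, a model fit on that learner's own (even limited) data can outperform a generic one.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_session(n, shift, rng):
    """Synthetic 8-D engagement features for one learner; labels 0..2
    stand in for 'Satisfied', 'Bored', 'Confused'. `shift` models a
    person-specific offset in how engagement is expressed."""
    y = rng.integers(0, 3, n)
    X = rng.normal(0.0, 1.0, (n, 8)) + 0.8 * y[:, None] + shift
    return X, y

def fit_centroids(X, y):
    # One mean feature vector per affective state.
    return np.stack([X[y == k].mean(axis=0) for k in range(3)])

def predict(centroids, X):
    # Assign each sample to the nearest class centroid.
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# "Generic" data from other learners; the target learner expresses
# engagement with an offset in feature space.
X_gen, y_gen = make_session(300, 0.0, rng)
X_per, y_per = make_session(60, 1.5, rng)   # limited person-specific data
X_te, y_te = make_session(60, 1.5, rng)     # held-out data, same learner

acc_generic = (predict(fit_centroids(X_gen, y_gen), X_te) == y_te).mean()
acc_personal = (predict(fit_centroids(X_per, y_per), X_te) == y_te).mean()
print(f"generic: {acc_generic:.2f}  personalized: {acc_personal:.2f}")
```

Under these assumptions the generic model misreads the shifted learner, while the model trained only on the learner's own 60 samples recovers the class structure, mirroring the paper's observation that person-specific training data helps even when it is very limited.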
