Actionable Affective Processing for Automatic Tutor Interventions

Once a tutoring system can detect students' emotions, it is not obvious how the tutor's behavior should change to turn that detection into a benefit for the student. For instance, if a student reports feeling excited, harder problems may be the appropriate response in one case, while in another the best response may be an activity that calms the student so that they can better focus. Both the cognitive and the emotional state matter when choosing the tutor's actions. The purpose of this chapter is to describe the elements a tutoring system needs in order to act appropriately on a detected affective state. The chapter is organized in three parts. First, we describe several methods for emotion detection. Next, we present a study with Wayang Outpost, our mathematics tutor, in which sensors detect the student's emotion and the tutor acts on it. We then discuss potential actions for each detected emotion, and conclude with the future steps needed to improve the actions of tutoring systems in general.
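
To make the decision problem concrete, the sketch below shows one way a tutor might combine a detected emotion with a simple cognitive estimate (here, recent answer accuracy) to select an intervention. This is a minimal illustration only, not the policy used in Wayang Outpost; the emotion labels, the 0.7 accuracy threshold, and the intervention names are all assumptions introduced for the example.

    # Minimal sketch of an affect-aware intervention policy.
    # The emotion labels, the 0.7 accuracy threshold, and the
    # intervention names are illustrative assumptions, not the
    # actual rules of Wayang Outpost or any other tutor.

    def choose_intervention(emotion: str, recent_accuracy: float) -> str:
        """Map a detected emotion plus a cognitive estimate to a tutor action."""
        if emotion == "excited":
            # The same emotion can call for different actions depending on
            # the cognitive state: challenge a student who is succeeding,
            # but help a struggling one refocus.
            return "harder_problem" if recent_accuracy >= 0.7 else "calming_activity"
        if emotion == "frustrated":
            return "hint" if recent_accuracy < 0.7 else "encouragement"
        if emotion == "bored":
            return "harder_problem"
        return "continue_current_problem"  # neutral or unrecognized emotion

    print(choose_intervention("excited", recent_accuracy=0.85))  # harder_problem
    print(choose_intervention("excited", recent_accuracy=0.40))  # calming_activity

The point of the sketch is the branching on both inputs: a policy keyed on emotion alone would return the same action for both of the calls above, which is exactly the ambiguity the excitement example describes.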
