Identification of Action Units Related to Affective States in a Tutoring System for Mathematics

Introduction

A facial expression is a fundamental component of non-verbal human communication. It not only conveys responses to external stimuli but also motivates action and adds meaning and richness to human experience. Affective computing (Picard, 2000) is a branch of computer science that aims to endow technology with mechanisms for recognizing, understanding and reacting to human emotions. Emotions encompass a range of psychological and physiological reactions that serve as cues in non-verbal communication. These cues are a fundamental part of human communication and provide the basis for technology that detects emotions automatically. The recognition of emotions remains an open problem in computer science (El Kaliouby & Robinson, 2005).

Emotions influence all human activities, and learning, whether in the classroom or with educational technology, is no exception. Previous work has recognized emotions by processing facial expressions, speech and other physiological signals (Shen et al., 2009). Of particular relevance to this investigation is the work of Craig et al. (2004), which indicates the importance of emotions in learning mediated by educational technology. Craig et al. (2004) distinguished between emotions expressed in non-classroom settings, such as joy or sadness, and the emotions displayed during learning with educational technology, and consequently proposed that only a subset of emotions is present in the educational context. Following this distinction, and to separate the broader set of emotions from those specific to educational technology, the term "affective states" is used in this paper to refer to boredom, frustration, confusion and cognitive engagement or flow (Craig et al., 2004).

Although the expression of affect is a normal part of students' interaction with educational technology, students may be more prone to strong affective reactions during problem-solving activities. In these contexts, educational technology encourages the resolution of mathematical problems through cycles of making mistakes and recovering from them. During these cycles, students may experience any or all of the affective states, since tutoring systems often propose challenging activities (D'Mello et al., 2010).

To fully endow educational technology with affective computing capabilities, it is necessary to develop tools that recognize, understand and react to students' affect. The recognition of affective states has remained elusive, with efforts focused on the physiological and psychological aspects related to affect. Previous research on affect has mainly been directed at the recognition problem because of the difficulties and ambiguities associated with recognizing human emotions (Ocumpaugh et al., 2012). Some progress has been made, but there is still no reliable, autonomous and context-independent solution for recognizing affect during the use of educational technology. One obstacle to developing such a tool is the context in which recognition takes place: this paper assumes that cultural differences among students (Terzis et al., 2013) add to the difficulty of building a generalized affect detector.
This paper addresses the problem of recognizing affective states, aiming to provide evidence that action units (conceived as universal facial reactions) can aid the recognition and understanding of students' affective states based on facial movements. The findings presented here also support the idea that students undergo affective trajectories and suggest that these trajectories have an impact on students' learning. The paper additionally offers pointers on affective display in a Mexican secondary school, but does not elaborate on how these displays compare with those of students from different cultural backgrounds. …
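To illustrate the general idea of mapping detected action units (AUs) to the affective states named above, the following minimal Python sketch scores hypothetical AU intensity vectors against hand-written AU patterns. The AU numbers, thresholds and AU-to-state associations shown are illustrative placeholders only; they are not the associations identified in this paper, and a real system would learn them from labeled video rather than hard-code them.

```python
# Hypothetical sketch: map facial action unit (AU) intensities to one of the
# four affective states discussed in the paper. All patterns and thresholds
# below are illustrative assumptions, not this study's findings.
from typing import Dict

# Illustrative AU patterns (FACS numbering): AU4 brow lowerer, AU7 lid
# tightener, AU12 lip corner puller, AU23 lip tightener, AU43 eyes closed.
ILLUSTRATIVE_PATTERNS: Dict[str, Dict[int, float]] = {
    "confusion":   {4: 0.6, 7: 0.4},
    "frustration": {4: 0.5, 23: 0.5},
    "boredom":     {43: 0.5},
    "engagement":  {12: 0.3},
}

def classify_affective_state(au_intensities: Dict[int, float]) -> str:
    """Return the state whose pattern best matches the AU intensities (0..1)."""
    def match_score(pattern: Dict[int, float]) -> float:
        # Sum the intensities of AUs that reach the pattern's per-AU threshold.
        return sum(au_intensities.get(au, 0.0)
                   for au, thr in pattern.items()
                   if au_intensities.get(au, 0.0) >= thr)

    best = max(ILLUSTRATIVE_PATTERNS,
               key=lambda state: match_score(ILLUSTRATIVE_PATTERNS[state]))
    return best if match_score(ILLUSTRATIVE_PATTERNS[best]) > 0 else "neutral"

# Example: strong brow lowering plus lid tightening is read as confusion here.
print(classify_affective_state({4: 0.8, 7: 0.5}))  # -> "confusion"
```

In practice the AU intensities would come from an automatic AU detector applied to video of the student, and the mapping would be a trained classifier rather than a fixed lookup table.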

[1] J. Brady, et al. The Belmont Report: Ethical principles and guidelines for the protection of human subjects of research. The Journal of the American College of Dentists, 2014.

[2] Sidney K. D'Mello, et al. Monitoring Affect States During Effortful Problem Solving Activities. Int. J. Artif. Intell. Educ., 2010.

[3] Brandon G. King, et al. Facial Features for Affective State Detection in Learning Environments. 2007.

[4] Ryan Shaun Joazeiro de Baker, et al. Collaboration in cognitive tutor use in Latin America: field study and design recommendations. CHI, 2012.

[5] D. Zucker. The Belmont Report. 2014.

[6] Ryan Shaun Joazeiro de Baker, et al. Exploring the Relationship between Novice Programmer Confusion and Achievement. ACII, 2011.

[7] Minjuan Wang, et al. Affective e-Learning: Using "Emotional" Data to Improve Learning in Pervasive Learning Environment. J. Educ. Technol. Soc., 2009.

[8] Scotty D. Craig, et al. Affect and learning: An exploratory look into the role of affect in learning with AutoTutor. 2004.

[9] Daniel McDuff, et al. Real-time inference of mental states from facial expressions and upper body gestures. Face and Gesture, 2011.

[10] Valerie J. Shute, et al. An exploratory analysis of confusion among students using Newton's playground. ICCE, 2014.

[11] Ryan Shaun Joazeiro de Baker, et al. Off-task behavior in the cognitive tutor classroom: when students "game the system". CHI, 2004.

[12] Sidney D'Mello, et al. Confusion and its dynamics during device comprehension with breakdown scenarios. Acta Psychologica, 2014.

[13] Ryan Shaun Joazeiro de Baker, et al. The Effects of an Interactive Software Agent on Student Affective Dynamics while Using an Intelligent Tutoring System. IEEE Transactions on Affective Computing, 2012.

[14] Eamonn J. Keogh, et al. A symbolic representation of time series, with implications for streaming algorithms. DMKD '03, 2003.

[15] Zhihong Zeng, et al. A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007.

[16] Rosalind W. Picard, et al. An affective model of interplay between emotions and learning: reengineering educational pedagogy-building a learning companion. Proceedings of the IEEE International Conference on Advanced Learning Technologies, 2001.

[17] Takeo Kanade, et al. Comprehensive database for facial expression analysis. Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, 2000.

[18] Peter Robinson, et al. Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures. Conference on Computer Vision and Pattern Recognition Workshop, 2004.

[19] Jacob Cohen. A Coefficient of Agreement for Nominal Scales. 1960.

[20] Anastasios A. Economides, et al. Computer Based Assessment Acceptance: A Cross-cultural Study in Greece and Mexico. J. Educ. Technol. Soc., 2013.

[21] Rana El Kaliouby, et al. Smile or smirk? Automatic detection of spontaneous asymmetric smiles to understand viewer experience. 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), 2013.

[22] Vladimir Pavlovic, et al. Real-Time Vision for Human-Computer Interaction. 2010.

[23] Ryan Shaun Joazeiro de Baker, et al. Adapting to When Students Game an Intelligent Tutoring System. Intelligent Tutoring Systems, 2006.

[24] P. Danielsson. Euclidean distance mapping. 1980.

[25] Tom Routen, et al. Intelligent Tutoring Systems. Lecture Notes in Computer Science, 1996.

[26] E. Vesterinen, et al. Affective Computing. Encyclopedia of Biometrics, 2009.

[27] P. Ekman. Facial expressions of emotion: an old controversy and new findings. Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences, 1992.

[28] Kin-Man Lam, et al. An efficient algorithm for human face detection and facial feature extraction under different conditions. 2001.

[29] A. Graesser, et al. Dynamics of affective states during complex learning. 2012.

[30] Ryan Shaun Joazeiro de Baker, et al. The Difficulty Factors Approach to the Design of Lessons in Intelligent Tutor Curricula. Int. J. Artif. Intell. Educ., 2007.

[31] Arthur C. Graesser, et al. Better to be frustrated than bored: The incidence, persistence, and impact of learners' cognitive-affective states during interactions with three different computer-based learning environments. Int. J. Hum. Comput. Stud., 2010.