Towards Human Affect Modeling: A Comparative Analysis of Discrete Affect and Valence-Arousal Labeling

There is still considerable disagreement on key aspects of affective computing, including even how affect itself is conceptualized. Using a multi-modal dataset collected while students watched instructional videos and answered questions on a learning platform, we compared the two key paradigms of affect representation: (1) affect as a set of discrete states and (2) affect as a point in a two-dimensional space of attributes. We specifically examined a set of discrete learning-related affective states (Satisfied, Confused, and Bored) that are hypothesized to map to specific regions of the Valence-Arousal space in the Circumplex Model of Emotion. For each paradigm, five human experts labeled student affect in the dataset. Using their labels, we investigated two research questions: (1) whether the hypothesized mappings between discrete affects and Valence-Arousal hold, and (2) whether affect labeling is more reliable with discrete affect or with Valence-Arousal. Contrary to expectations, the results show that the discrete labels did not map directly onto the Valence-Arousal quadrants of the Circumplex Model of Emotion, indicating that the experts perceived and labeled affect differently under the two paradigms. On the other hand, the inter-rater agreement results show that the experts agreed moderately with each other within both paradigms. These results imply that researchers and practitioners should consider how affect information will be used operationally in an intelligent system when choosing between the two key paradigms of affect.
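To make the two research questions concrete, the Python sketch below illustrates (not from the paper) how one could (1) check whether discrete labels land in their hypothesized Valence-Arousal quadrants and (2) quantify inter-rater agreement. The quadrant assignments, the toy labels, and the use of Cohen's kappa (a two-rater simplification; the study used five experts) are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the authors' pipeline):
# map Valence-Arousal labels to Circumplex quadrants, compare against a
# hypothesized discrete-to-quadrant mapping, and compute rater agreement.
from sklearn.metrics import cohen_kappa_score

def va_to_quadrant(valence: float, arousal: float) -> str:
    """Assign a (valence, arousal) pair to one of the four Circumplex quadrants."""
    if valence >= 0:
        return "+V/+A" if arousal >= 0 else "+V/-A"
    return "-V/+A" if arousal >= 0 else "-V/-A"

# Hypothetical mapping from discrete labels to expected quadrants (assumption).
HYPOTHESIZED = {"Satisfied": "+V/-A", "Confused": "-V/+A", "Bored": "-V/-A"}

# Toy labels for the same video segments under the two paradigms (assumption).
discrete_labels = ["Satisfied", "Confused", "Bored", "Confused"]
va_labels = [(0.6, -0.2), (-0.4, 0.5), (-0.5, -0.6), (0.1, 0.3)]

# RQ1-style check: fraction of segments whose V-A quadrant matches the hypothesis.
matches = [
    HYPOTHESIZED[d] == va_to_quadrant(v, a)
    for d, (v, a) in zip(discrete_labels, va_labels)
]
print(f"Agreement with hypothesized mapping: {sum(matches) / len(matches):.2f}")

# RQ2-style check: inter-rater agreement between two raters' discrete labels.
rater_a = ["Satisfied", "Confused", "Bored", "Confused"]
rater_b = ["Satisfied", "Bored", "Bored", "Confused"]
print(f"Cohen's kappa between raters: {cohen_kappa_score(rater_a, rater_b):.2f}")
```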
