Towards emotionally-personalized computing: Dynamic prediction of student mental states from self-manipulatory body movements

An emotionally-personalized computer that could empathize with a student working through a tutorial or a software program would be an excellent application of affective computing. Towards the development of this potentially beneficial technology, we describe two related evaluations of a student mental state prediction model that not only predicts a student's mental state from visually observable behavior but also detects the student's personality. In the first set of evaluations, we model the assumed cause-effect relationships between a student's mental states and body gestures using a two-layered dynamic Bayesian network (DBN). We use data obtained earlier from four students in a highly contextualized interaction, namely students attending a classroom lecture, and train and test the DBN on data from each individual student. A maximum a posteriori (MAP) classifier based on the DBN model gives an average accuracy of 87.6% over the four individual student cases. In the second set of evaluations, we extend the model to a three-layered DBN by including a personality attribute in the network and train it on data from all four students. At test time, the network successfully detects the personality of each test student. The results demonstrate the feasibility of our approach.
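To make the two-layered DBN concrete, the sketch below shows MAP decoding of a hidden mental-state sequence from observed gestures, implemented as Viterbi decoding over a hidden-Markov-model structure (the simplest two-layered DBN). The state labels, gesture labels, and all probability values are illustrative placeholders, not the paper's learned parameters or its actual state space.

```python
import numpy as np

# Hypothetical labels and parameters; the paper's actual model uses
# different states, gestures, and probabilities learned from student data.
states = ["concentrating", "tired"]
gestures = ["writing", "head_on_hand"]

prior = np.array([0.6, 0.4])          # P(S_0): initial mental state
trans = np.array([[0.8, 0.2],         # P(S_t | S_{t-1}): state dynamics
                  [0.3, 0.7]])
emit = np.array([[0.9, 0.1],          # P(O_t | S_t): gesture given state
                 [0.2, 0.8]])

def map_state_sequence(obs):
    """Viterbi decoding: the MAP hidden-state sequence given gestures."""
    T, n = len(obs), len(states)
    delta = np.zeros((T, n))          # best path probability ending in each state
    back = np.zeros((T, n), dtype=int)
    delta[0] = prior * emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * trans     # (from_state, to_state)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * emit[:, obs[t]]
    # Trace the best path backwards from the most probable final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

obs = [gestures.index(g) for g in
       ["writing", "writing", "head_on_hand", "head_on_hand"]]
print(map_state_sequence(obs))
# → ['concentrating', 'concentrating', 'tired', 'tired']
```

The three-layered extension described in the abstract would add a personality node above the state layer, conditioning the transition and emission parameters on a per-student personality attribute inferred jointly with the states.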
