Riding an emotional roller-coaster: A multimodal study of young child's math problem solving activities

Solving challenging math problems often invites a child to ride an “emotional roller-coaster” and to experience a complex mixture of emotions, including confusion, frustration, joy, and surprise. Early exposure to this type of “hard fun” may stimulate a child’s interest in and curiosity about mathematics and nurture lifelong skills such as resilience and perseverance. Without optimal support, however, it may also turn a child off prematurely due to unresolved frustration. An ideal teacher is able to pick up a child’s subtle emotional signals in real time and respond optimally with cognitive and emotional support. To design an intelligent tutor for this purpose, it is necessary to understand, at a fine-grained level, the child’s emotional experience and its interplay with the interpersonal communication dynamics between a child and his/her teacher. In this study, we made such an attempt by analyzing a series of video recordings of problem solving sessions between a young student and his mom, the ideal teacher. We demonstrate a multimodal analysis framework that characterizes several aspects of the child–mom interaction patterns within their emotional context at a granular level. We then build machine learning models to predict the teacher’s response from extracted multimodal features. In addition, we validate the performance of automatic detectors of affect, intent-to-connect behavior, and voice activity against annotated data, which provides evidence of the potential utility of the presented tools both for scaling up this type of analysis to a large number of subjects and for implementing tools that guide teachers toward optimal interactions in real time.
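The abstract describes predicting the teacher’s response from extracted multimodal features. As an illustration only — the paper’s actual features, labels, and model family are not specified in this abstract — the following NumPy sketch trains a logistic-regression classifier on synthetic stand-in features; every feature name and all data here are hypothetical.

```python
# Hypothetical sketch: predict a binary teacher response (e.g., "offers hint"
# vs. "stays silent") for each time window from concatenated multimodal
# features, using logistic regression fit by gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: each row concatenates hypothetical per-window features,
# e.g., facial action-unit intensities, prosodic statistics, a voice-activity
# flag. Labels come from a known linear rule plus noise, so a linear model
# should recover them well.
X = rng.normal(size=(200, 6))
true_w = np.array([1.5, -2.0, 0.0, 0.5, 1.0, -0.5])
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

def train_logreg(X, y, lr=0.1, steps=500):
    """Gradient-descent logistic regression; returns learned weights."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted P(response = 1)
        w -= lr * X.T @ (p - y) / len(y)     # gradient of the log-loss
    return w

w = train_logreg(X, y)
accuracy = np.mean(((X @ w) > 0) == (y > 0.5))
print(f"training accuracy: {accuracy:.2f}")
```

In practice the feature vector would be produced by tools such as those validated in the study (facial-expression, intent-to-connect, and voice-activity detectors), with one prediction per annotated interaction window.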
