Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration

For technology (like serious games) that aims to deliver interactive learning, it is important to address relevant mental experiences such as reflective thinking during problem solving. To facilitate research in this direction, we present the weDraw-1 Movement Dataset of body movement sensor data and reflective thinking labels for 26 children solving mathematical problems in unconstrained settings where full-body or part-body movement was required to explore these problems. Further, we provide a qualitative analysis of the behaviours that observers used to identify reflective thinking moments in these sessions. The body movement cues from this compilation informed features that led to an average F1 score of 0.73 for automatic detection of reflective thinking based on Long Short-Term Memory (LSTM) neural networks. We further obtained an average F1 score of 0.79 for end-to-end detection of reflective thinking periods, i.e. detection based on raw sensor data rather than hand-crafted features. Finally, the algorithms achieved an average F1 score of 0.64 for period subsegments as short as 4 seconds. Overall, our results show the possibility of detecting reflective thinking moments from the body movement behaviours of a child exploring mathematical concepts bodily, such as within serious game play.
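
To make the detection setup concrete, the sketch below shows a minimal LSTM binary classifier over fixed-length windows of body movement data, in the spirit of the approach described above. It is not the authors' implementation; the channel count, window length (a 4-second window at an assumed 30 Hz sampling rate), and hyperparameters are illustrative assumptions only.

```python
# Minimal sketch (assumptions only, not the paper's implementation):
# an LSTM that maps a window of movement features or raw sensor data
# to a single logit for "reflective thinking" vs. "not reflective thinking".
import torch
import torch.nn as nn


class ReflectiveThinkingLSTM(nn.Module):
    def __init__(self, n_channels=12, hidden_size=64, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=n_channels,   # sensor/feature channels per time step (assumed)
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
        )
        self.classifier = nn.Linear(hidden_size, 1)  # binary output

    def forward(self, x):
        # x: (batch, time_steps, n_channels)
        outputs, _ = self.lstm(x)
        last_hidden = outputs[:, -1, :]       # summary of the whole window
        return self.classifier(last_hidden)   # raw logit; apply sigmoid for a probability


if __name__ == "__main__":
    # Example: a batch of 8 windows, each 4 s at an assumed 30 Hz with 12 channels.
    model = ReflectiveThinkingLSTM()
    window = torch.randn(8, 120, 12)          # dummy data standing in for sensor input
    probs = torch.sigmoid(model(window))
    print(probs.shape)                        # torch.Size([8, 1])
```

For the end-to-end variant reported in the abstract, the same kind of recurrent model would consume raw sensor streams directly instead of pre-computed movement-cue features; per-window predictions can then be evaluated with the F1 score on held-out children.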
