Modelling collaborative problem-solving competence with transparent learning analytics: is video data enough?

In this study, we describe the results of our research to model collaborative problem-solving (CPS) competence based on analytics generated from video data. We collected ~500 minutes of video data from 15 groups of 3 students working to solve design problems collaboratively. Initially, with the help of OpenPose, we automatically generated frequency metrics, such as the number of faces on screen, and distance metrics, such as the distance between bodies. Based on these metrics, we built decision trees to predict students' listening, watching, making, and speaking behaviours, as well as their CPS competence. Our results provide useful decision rules mined from video analytics that can inform teacher dashboards. Although the accuracy and recall of our models are inferior to those of previous machine learning work that uses multimodal data, the transparent nature of decision trees provides opportunities for explainable analytics for teachers and learners. This can give teachers and learners more agency and, in turn, ease adoption. We conclude the paper with a discussion of the value and limitations of our approach.
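The pipeline described above (pose-derived metrics feeding an interpretable decision tree whose rules can surface on a dashboard) can be sketched as follows. This is a minimal illustration only: the feature names, the synthetic data, and the labelling rule are hypothetical stand-ins, not the study's actual OpenPose features or behaviour-coding scheme.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical per-time-window features of the kind derivable from OpenPose
# keypoints: mean inter-body distance and count of faces visible on screen.
rng = np.random.default_rng(0)
n_windows = 60
inter_body_distance = rng.uniform(0.2, 1.5, n_windows)  # normalised units
faces_on_screen = rng.integers(0, 4, n_windows)         # 0..3 faces visible

X = np.column_stack([inter_body_distance, faces_on_screen])
# Toy labels: close proximity with all three faces visible counts as
# collaborative "making" (1), anything else as "other" (0). Purely
# illustrative -- the study's coding of listening/watching/making/speaking
# came from human annotation, not a rule like this.
y = ((inter_body_distance < 0.8) & (faces_on_screen == 3)).astype(int)

# A shallow tree keeps the model transparent: its splits read as if-then
# rules a teacher could inspect on a dashboard.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
rules = export_text(
    clf, feature_names=["inter_body_distance", "faces_on_screen"]
)
print(rules)
```

`export_text` prints the fitted tree as nested threshold rules (e.g. "faces_on_screen <= 2.5 → class 0"), which is the kind of human-readable decision rule the paper proposes feeding into teacher dashboards.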
