(Dis)engagement matters: identifying efficacious learning practices with multimodal learning analytics

Video analysis is a staple of the education research community; for many contemporary education researchers, participation in the video coding process serves as a rite of passage. Recent developments in multimodal learning analytics, however, may accelerate and enhance this process by providing researchers with a more nuanced view of a set of learning experiences. As an example of how to use multimodal learning analytics toward these ends, this paper presents a preliminary analysis of 54 college students who completed two engineering design tasks in pairs. Gesture, speech, and electrodermal activity data were collected as students completed these tasks. The gesture data were used to learn a set of canonical clusters (N=4), and a decision tree was trained on individual students' cluster frequencies and pre-post learning gains. The nodes of the decision tree were then used to identify a subset of video segments, which were human coded based on prior work in learning analytics and engineering design. This combination of machine learning and human inference helps elucidate the practices that seem to correlate with student learning. In particular, both engagement and disengagement appear to correlate with student learning, albeit in a somewhat nuanced fashion.
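The pipeline the abstract describes can be sketched in a few steps: cluster gesture features into a small set of canonical clusters, summarize each student as a cluster-frequency vector, and fit a decision tree relating those frequencies to pre-post learning gains. The sketch below is illustrative only, assuming k-means for the clustering step and a binarized gain label; the feature dimensions, tree depth, and random data are placeholders, not the authors' actual parameters.

```python
# Hypothetical sketch of the analysis pipeline described in the abstract.
# All data here are synthetic stand-ins for the real gesture features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Placeholder gesture feature vectors: (frames per student) x (students).
n_students, frames_per_student, n_features = 54, 100, 6
X = rng.normal(size=(n_students * frames_per_student, n_features))

# Step 1: learn N=4 canonical gesture clusters (k-means assumed here).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Step 2: represent each student by their cluster-frequency vector.
labels = kmeans.labels_.reshape(n_students, frames_per_student)
freqs = np.stack([np.bincount(row, minlength=4) / frames_per_student
                  for row in labels])

# Step 3: train a decision tree on frequencies vs. (binarized) gains.
gains = rng.integers(0, 2, size=n_students)  # placeholder gain labels
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(freqs, gains)

# The tree's split nodes expose cluster-frequency thresholds that can be
# used to select a subset of video segments for human coding.
```

The split thresholds in the fitted tree (available via `tree.tree_.feature` and `tree.tree_.threshold`) are what make the hybrid step possible: segments from students falling on either side of an informative split can be pulled for qualitative human coding.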