From signals to knowledge: A conceptual model for multimodal learning analytics

Multimodality in learning analytics and learning science is under the spotlight. The landscape of sensors and wearable trackers that can be used for learning support is evolving rapidly, as are methods for data collection and analysis. Multimodal data can now be collected and processed in real time at an unprecedented scale. With sensors, it is possible to capture observable events of the learning process, such as learners' behaviour and the learning context. The learning process, however, also consists of latent attributes, such as the learner's cognitions or emotions. These attributes are unobservable to sensors and need to be elicited through human-driven interpretation. We conducted a literature survey of experiments using multimodal data to frame the young research field of multimodal learning analytics. The survey explored the multimodal data used in related studies (the input space) and the learning theories selected (the hypothesis space). The survey led to the formulation of the Multimodal Learning Analytics Model, whose main objectives are to (O1) map the use of multimodal data to enhance feedback in a learning context; (O2) show how to combine machine learning with multimodal data; and (O3) align the terminology used in the fields of machine learning and learning science.
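
To make objective O2 concrete, the sketch below shows one common pattern for combining machine learning with multimodal data: features extracted from several sensor streams (the input space) are paired with human-provided annotations of a latent attribute (the hypothesis space) and passed to a supervised classifier. This is a minimal illustration under stated assumptions, not the model proposed in the paper; the sensor features, the "engaged" label, and all data are synthetic and hypothetical.

```python
# Minimal sketch (not the paper's method): mapping observable multimodal
# signals to a human-annotated latent attribute with supervised learning.
# All features, labels, and data below are synthetic and hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # hypothetical number of short learning-session windows

# Input space: per-window features derived from sensor streams.
heart_rate = rng.normal(75, 10, n)        # wearable tracker
skin_conductance = rng.normal(5, 1.5, n)  # electrodermal activity sensor
head_movement = rng.normal(0.2, 0.05, n)  # camera/depth sensor
X = np.column_stack([heart_rate, skin_conductance, head_movement])

# Hypothesis space: a latent attribute (e.g., engagement) that sensors
# cannot observe directly; here a noisy synthetic label stands in for
# human-driven annotation of the same windows.
y = (skin_conductance + rng.normal(0, 1, n) > 5).astype(int)

# A classifier learns the mapping from observable signals to the
# annotated latent attribute; cross-validation estimates how well
# that mapping generalises to unseen windows.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```

The sketch encodes the division of labour described above: sensors supply the observable signals, while the latent attribute enters the pipeline only through human annotation.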
