How to Complement Learning Analytics with Smartwatches?: Fusing Physical Activities, Environmental Context, and Learning Activities

To obtain a holistic perspective on learning, a multimodal technical infrastructure for Learning Analytics (LA) can be beneficial. Recent studies have investigated various aspects of technical LA infrastructure. However, it has not yet been explored how LA indicators can be complemented with smartwatch sensor data to detect physical activity and environmental context. Sensor data, such as accelerometer readings, are often used in related work to infer specific behaviors and environmental contexts and thus to trigger interventions on a just-in-time basis. In this dissertation project, we plan to use smartwatch sensor data to explore further indicators of learning in blended learning sessions conducted in-the-wild, e.g., at home. Such indicators could be used within learning sessions to suggest breaks, or afterward to support learners in reflection processes. We plan to investigate the following three research questions: (RQ1) How can a multimodal learning analytics infrastructure be designed to effectively support real-time data acquisition and processing? (RQ2) How can smartwatch sensor data be used to infer environmental context and physical activities that complement learning analytics indicators for blended learning sessions? (RQ3) How can the extracted multimodal indicators be aligned with pedagogical interventions? RQ1 was investigated through a structured literature review and eleven semi-structured interviews with LA infrastructure developers. For RQ2, we are currently designing and implementing a multimodal learning analytics infrastructure to collect and process sensor and experience data from smartwatches. Finally, for RQ3, an exploratory field study will be conducted to extract multimodal learning indicators and examine them with learners and pedagogical experts to develop effective interventions.
Researchers, educators, and learners can use and adapt our contributions to gain new insights into learners' time use, learning tactics, and physical learning spaces in learning sessions taking place in-the-wild.
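To make the kind of inference mentioned above concrete, the following is a minimal sketch of classifying coarse physical activity from smartwatch accelerometer windows. It is our own illustrative example, not part of the proposed infrastructure: the function names, the "still"/"active" labels, and the variance threshold are all assumptions, and a deployed system would use richer features and a trained classifier.

```python
import math

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample in m/s^2."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_window(samples, threshold=0.5):
    """Label a window of samples by the variability of its acceleration magnitude.

    A resting wrist shows a near-constant magnitude (gravity only), so a low
    standard deviation suggests "still"; movement adds variance.
    """
    mags = [magnitude(s) for s in samples]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return "active" if std > threshold else "still"

# Illustrative windows: a motionless wrist vs. a moving one.
still_window = [(0.0, 0.0, 9.81)] * 50
active_window = [(0.0, 0.0, 9.81), (3.0, 1.0, 11.0), (-2.0, 0.5, 8.0)] * 17

print(classify_window(still_window))   # -> still
print(classify_window(active_window))  # -> active
```

Such per-window labels could then be aggregated over a learning session, e.g., to suggest a break after a long uninterrupted "still" period.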
