The Proof of the Pudding: Examining Validity and Reliability of the Evaluation Framework for Learning Analytics

While learning analytics (LA) is maturing from a trend into part of the institutional toolbox, the need for empirical evidence about the effects of LA on its actual stakeholders, i.e. learners and teachers, is increasing. In this paper we report on a further evaluation iteration of the Evaluation Framework for Learning Analytics (EFLA), which provides an efficient and effective measure for gaining insight into the application of LA in educational institutions. For this empirical study we developed and implemented several LA widgets in a MOOC platform's dashboard, evaluated these widgets using the EFLA, and evaluated the framework itself using principal component and reliability analysis. The results show that the EFLA is able to measure differences between widget versions. Furthermore, they indicate that the framework is highly reliable after slight adaptation of its dimensions.
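
The abstract mentions principal component and reliability analysis of EFLA responses. As a minimal sketch of how such an analysis could be run, the snippet below applies PCA and computes Cronbach's alpha on a hypothetical respondents-by-items matrix; the number of items, the Likert scale, and the simulated data are assumptions for illustration only and are not taken from the study.

```python
# Hypothetical sketch: PCA and internal-consistency (Cronbach's alpha) analysis
# of questionnaire responses, assuming a respondents-by-items matrix.
import numpy as np
from sklearn.decomposition import PCA

# Placeholder data: 120 respondents answering 8 items on a 1-7 Likert scale.
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(120, 8)).astype(float)

# Principal component analysis to inspect how much variance each component explains.
pca = PCA()
pca.fit(responses)
print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix."""
    k = items.shape[1]
    sum_item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - sum_item_variances / total_variance)

print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```

In practice the response matrix would be split per EFLA dimension before computing alpha, so that reliability is judged dimension by dimension rather than over the whole instrument.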
