Improving Learning Analytics - Combining Observational and Self-Report Data on Student Learning

Introduction

Using learning analytics as a tool to improve student learning has caught the imagination and research effort of much of the higher education sector (Siemens, 2013). Amongst a number of applications, it has notably been used to improve student success (Arnold, Hall, Street, Lafayette, & Pistilli, 2012; Martin et al., 2013), to better understand the nature of social learning amongst university students (Buckingham Shum & Ferguson, 2012), to improve approaches to learning design (Mor, Ferguson, & Wasson, 2015), and to guide university education strategy (Rienties et al., 2016).

Accompanying this growing use of learning analytics is a serious debate about the extent to which they are useful as a tool for improving student learning (Lodge & Lewis, 2012; Lundie, 2014). One debate concerns the objectivity of learning analytics: some argue that learning analytics are an objective measure of student activity, while others suggest that without understanding the student intent behind the analytics, we have a poor context in which to interpret what the numbers mean (Boyd & Crawford, 2012). A second debate concerns whether learning analytics tell us what students are doing when they learn in an online environment or, as doubters argue, only which buttons they are clicking (Scheffel, Drachsler, Stoyanov, & Specht, 2014). A further debate surrounds the value of very large datasets. Some argue that the more analytics we have about student learning experiences the better, while others argue that a careful selection of analytics must be made in relation to the population sample; otherwise the additional metrics may simply add noise to their interpretation. As some studies suggest, indiscriminate approaches to the use of large datasets could lead to unintended consequences in learning interventions (Boyd & Crawford, 2012; Greller & Drachsler, 2012).

To remedy some of these perceived shortfalls, some authors argue that learning analytics should occupy a middle space between learning theory and computational measurement if they are to address concerns about the quality of student learning (Suthers & Verbert, 2013). To achieve this, they recommend that learning analytic procedures be accompanied by additional analytic techniques from fields such as epistemology and education studies.

To investigate methodological approaches that address some of these perceived shortfalls, this study examines the first-year experience of undergraduate engineering students in a blended course in two stages. In the first stage, it records their learning events in the online environment and analyses and interprets them in the context of their learning outcomes (Pardo, Han, & Ellis, 2016). While illuminating, this analysis alone would remain open to some of the criticisms described above. In the second stage, methodological approaches from Student Approaches to Learning research (Pintrich, 2004) are used, and the students' responses to closed-ended questionnaires about their experience of learning (Biggs, Kember, & Leung, 2001) are investigated. The outcomes of this analysis, when combined with those of stage 1, both elucidate why some students are relatively more successful than others in the course and provide evidence suggesting why this might be the case.
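By way of illustration only, the sketch below shows one way observational event logs and R-SPQ-2F self-report scores could be joined at the student level and related to course marks. The file names, column names, and the simple correlational analysis are assumptions made for the example; they are not the procedures reported in this study.

    # Minimal sketch (not the authors' actual pipeline): merging observational
    # event counts with R-SPQ-2F self-report scores and relating both to marks.
    # All file and column names are hypothetical.
    import pandas as pd
    from scipy import stats

    # Stage 1: observational data -- one row per logged learning event.
    events = pd.read_csv("lms_events.csv")
    activity = (events.groupby("student_id")
                      .size()
                      .rename("n_events")
                      .reset_index())

    # Stage 2: self-report data -- R-SPQ-2F responses (20 items on a 1-5
    # Likert scale). Here the deep and surface items are assumed to be
    # pre-labelled with "deep_" and "surface_" column prefixes.
    spq = pd.read_csv("rspq2f_responses.csv")
    deep_items = [c for c in spq.columns if c.startswith("deep_")]
    surface_items = [c for c in spq.columns if c.startswith("surface_")]
    spq["deep_approach"] = spq[deep_items].sum(axis=1)
    spq["surface_approach"] = spq[surface_items].sum(axis=1)

    # Combine both data sources with final course marks.
    marks = pd.read_csv("course_marks.csv")
    combined = (activity
                .merge(spq[["student_id", "deep_approach", "surface_approach"]],
                       on="student_id")
                .merge(marks, on="student_id"))

    # Illustrative associations between each measure and marks.
    for measure in ["n_events", "deep_approach", "surface_approach"]:
        r, p = stats.pearsonr(combined[measure], combined["mark"])
        print(f"{measure}: r = {r:.2f}, p = {p:.3f}")
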
The purpose of this study is to contribute to the international debate on the value of learning analytics for the quality of the student learning experience, and to show how combined methodological approaches using observational and self-report evidence can improve our understanding of qualitative variation in student learning. By drawing on both types of data from the same experience of learning, the study is designed to examine the extent to which a combined use of observational and self-report data improves our ability to use learning analytics to understand why some students are more successful than others. …

[1] Peter Maurer et al. The Cambridge Handbook of the Learning Sciences. 2022.

[2] Paul Ashwin et al. Evoked prior learning experience and approach to learning as predictors of academic achievement. The British Journal of Educational Psychology, 2013.

[3] Xavier Ochoa et al. Techniques for data-driven curriculum analysis. LAK, 2014.

[4] Taylor Martin et al. Nanogenetic learning analytics: illuminating student learning pathways in an online fraction game. LAK '13, 2013.

[5] J. Biggs et al. The revised two-factor Study Process Questionnaire: R-SPQ-2F. The British Journal of Educational Psychology, 2001.

[6] G. Boulton-Lewis. Teaching for Quality Learning at University. 2008.

[7] Maren Scheffel et al. Quality Indicators for Learning Analytics. J. Educ. Technol. Soc., 2014.

[8] F. Marton et al. On qualitative differences in learning: I. Outcome and process. 1976.

[9] Marie Schmidt et al. Learning to Teach in Higher Education. 1992.

[10] Michael Prosser et al. The "How" and "What" of learning physics. 1989.

[11] Jason M. Lodge et al. Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. 2012.

[12] Shane Dawson et al. Informing Pedagogical Action. 2013.

[13] Elaine Martin et al. Dissonance in Experience of Teaching and its Relation to the Quality of Student Learning. 2003.

[14] Peter Goodyear et al. Learning through face-to-face and online discussions: Associations between students' conceptions, approaches and academic performance in political science. Br. J. Educ. Technol., 2010.

[15] R. Calvo et al. Engineering students' conceptions of and approaches to learning through discussions in face-to-face and online contexts. 2008.

[16] E. V. Rossum et al. The Relationship between Learning Conception, Study Strategy and Learning Outcome. 1984.

[17] Tara Fenwick et al. Sociomateriality and learning: a critical approach. 2015.

[18] K. Trigwell et al. Relating approaches to study and quality of learning outcomes at the course level. 1991.

[19] Karen Littleton et al. Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space. J. Learn. Anal., 2014.

[20] Matthew D. Pistilli et al. Course signals at Purdue: using learning analytics to increase student success. LAK, 2012.

[21] Carlos Delgado Kloos et al. Monitoring student progress using virtual appliances: A case study. Comput. Educ., 2012.

[22] Hanan Ayad et al. Student success system: risk analytics and data visualization using ensembles of predictive models. LAK, 2012.

[23] P. Pintrich. A Conceptual Framework for Assessing Motivation and Self-Regulated Learning in College Students. 2004.

[24] J. Biggs. Student Approaches to Learning and Studying. 1987.

[25] Hanan Ayad et al. Improving student success using predictive models and data visualisations. 2012.

[26] Taylor Martin et al. Educational Data Mining: Illuminating Student Learning Pathways in an Online Fraction Game. EDM, 2013.

[27] Velda McCune et al. Investigating ways of enhancing university teaching-learning environments: Measuring students' approaches to studying and perceptions of teaching. 2003.

[28] Katrien Verbert et al. Learning analytics as a "middle space". LAK '13, 2013.

[29] Simon Buckingham Shum et al. Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. International Conference on Learning Analytics and Knowledge, 2012.

[30] Alfred Joseph Lizzio et al. University Students' Perceptions of the Learning Environment and Academic Outcomes: Implications for theory and practice. 2002.

[31] Cláudia Antunes. Anticipating student's failure as soon as possible. 2009.

[32] Jacob Cohen. A power primer. Psychological Bulletin, 1992.

[33] Jim Gaston et al. Sherpa: increasing student success with a recommendation engine. LAK '12, 2012.

[34] Abelardo Pardo et al. Exploring the relation between self-regulation, online activities, and academic performance: a case study. LAK, 2016.

[35] Robert A. Ellis et al. Relations between students' approaches to learning, experienced emotions and outcomes of learning. 2012.

[36] Rebecca Ferguson et al. Social Learning Analytics. J. Educ. Technol. Soc., 2012.

[37] Barbara Wasson et al. Editorial: Learning design, teacher inquiry into student learning and learning analytics: A call for action. Br. J. Educ. Technol., 2015.

[38] K. Trigwell et al. Understanding Learning and Teaching: the experience in higher education. 1999.

[39] George Siemens. Learning Analytics. 2013.

[40] B. Rienties et al. Analytics 4 Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK. 2016.

[41] Sebastián Ventura et al. Data mining in education. WIREs Data Mining Knowl. Discov., 2013.

[42] Ray Sleet et al. Improving the relationship between assessment results and student understanding. 1990.

[43] Ryan S. Baker et al. Educational Data Mining and Learning Analytics. 2014.

[44] Andy P. Field. Discovering Statistics Using IBM SPSS Statistics. 2017.

[45] Abelardo Pardo et al. Combining observational and experiential data to inform the redesign of learning activities. LAK, 2015.

[46] David Lundie. Learning Analytics and the Education of the Human. 2014.

[47] Kirsti Lonka et al. Individual Ways of Interacting with the Learning Environment: Are They Related to Study Success? 1998.

[48] Bart Rienties et al. Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK. 2016.

[49] Hendrik Drachsler et al. Translating Learning into Numbers: A Generic Framework for Learning Analytics. J. Educ. Technol. Soc., 2012.

[50] D. Hay. Using concept maps to measure deep, surface and non-learning outcomes. 2007.