Understanding the Role of Time on Task in Formative Assessment: The Case of Mathematics Learning

Mastery data derived from formative assessments constitute a rich data source for the development of student performance prediction models. The dominance of formative assessment mastery data over use-intensity data, such as time on task or number of clicks, was the outcome of previous research by the authors in a dispositional learning analytics context [1, 2, 3]. The practical implications of these findings are far-reaching, contradicting the current practice of building (learning analytics based) student performance prediction models with intensity data as the central predictor variables. In this empirical follow-up study, using data from 2,011 students, we search for an explanation of why time-on-task data are dominated by mastery data. We do so by investigating more general models that allow for nonlinear, even non-monotonic, relationships between time on task and performance measures. Clustering students into subsamples with different time-on-task characteristics suggests that heterogeneity of the sample is an important cause of the nonlinear relationships with performance measures. Time-on-task data appear to be more sensitive to the effects of heterogeneity than mastery data, providing a further argument to prioritize formative assessment mastery data as predictor variables in the design of prediction models aimed at generating learning feedback.
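
As a concrete illustration of the analysis strategy described above, the sketch below clusters a simulated cohort on time on task and then fits a quadratic (non-monotonic) relationship between time on task and an exam score, both pooled and within each cluster, alongside a linear fit on mastery for comparison. This is a minimal sketch on synthetic data: the variable names (time_on_task, mastery, exam), the three-cluster choice, and the quadratic specification are illustrative assumptions, not the study's actual data or model specification.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
n = 500  # illustrative sample size, not the study's 2,011 students

# Simulated predictors (hypothetical, for illustration only)
time_on_task = rng.gamma(shape=2.0, scale=5.0, size=n)      # hours of tool use
mastery = np.clip(rng.normal(0.6, 0.2, size=n), 0.0, 1.0)   # quiz mastery, 0-1 scale

# Simulated exam score: driven mainly by mastery, with an inverted-U
# (non-monotonic) contribution of time on task plus noise
exam = (40 * mastery + 2.0 * time_on_task - 0.08 * time_on_task ** 2
        + rng.normal(0.0, 5.0, size=n))

# Cluster students on standardized time on task to expose heterogeneous subgroups
z = ((time_on_task - time_on_task.mean()) / time_on_task.std()).reshape(-1, 1)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

# Quadratic (non-monotonic) fit of exam score on time on task, pooled and per cluster
quad = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
X = time_on_task.reshape(-1, 1)
print(f"pooled quadratic R^2 (time on task): {quad.fit(X, exam).score(X, exam):.3f}")
for c in range(3):
    m = clusters == c
    r2 = quad.fit(X[m], exam[m]).score(X[m], exam[m])
    print(f"cluster {c} (n={m.sum():3d}): quadratic R^2 = {r2:.3f}")

# Plain linear fit of exam score on mastery for comparison; in this simulated
# setup mastery remains the stronger, more stable predictor across subgroups
M = mastery.reshape(-1, 1)
print(f"pooled linear R^2 (mastery): {LinearRegression().fit(M, exam).score(M, exam):.3f}")

Under these assumptions, the within-cluster fits typically differ from the pooled fit, illustrating how sample heterogeneity can show up as nonlinear or non-monotonic time-on-task effects while the mastery relationship stays comparatively stable.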

[1] Susanne Narciss, et al. Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training for written subtraction. 2006.

[2] Andrew J. Martin. Examining a multidimensional model of student motivation and engagement using a construct validation approach. 2007. The British Journal of Educational Psychology.

[3] Dirk T. Tempelaar, et al. Stability and Sensitivity of Learning Analytics based Prediction Models. 2015. CSEDU.

[4] Dirk T. Tempelaar, et al. Computer Assisted, Formative Assessment and Dispositional Learning Analytics in Learning Mathematics and Statistics. 2014. CAA.

[5] S. Narciss. Feedback Strategies for Interactive Learning Tasks. 2007.

[6] Norbert Pachler, et al. Scoping a vision for formative e-assessment: a project report for JISC. 2009.

[7] Dirk T. Tempelaar, et al. In search for the most informative data for feedback generation: Learning analytics in a data-rich context. 2015. Comput. Hum. Behav.

[8] Simon Buckingham Shum, et al. Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. 2012. International Conference on Learning Analytics and Knowledge.

[9] Anne C. Frenzel, et al. Measuring emotions in students' learning and performance: The Achievement Emotions Questionnaire (AEQ). 2011.

[10] Rebecca Ferguson, et al. Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs). 2015. LAK.

[11] Dirk T. Tempelaar, et al. Formative assessment and learning analytics. 2013. LAK '13.

[12] Erik Duval, et al. Dataset-Driven Research to Support Learning and Knowledge Analytics. 2012. J. Educ. Technol. Soc.

[13] P. Black, et al. Developing the theory of formative assessment. 2009.

[14] Reinhard Pekrun, et al. Perceived Academic Control and Failure in College Students: A Three-Year Study of Scholastic Attainment. 2005.

[15] Carol Calvert. Developing a model and applications for probabilities of student success: a case study of predictive analytics. 2014.

[16] Dirk T. Tempelaar, et al. How achievement emotions impact students' decisions for online learning, and what precedes those emotions. 2012. Internet High. Educ.

[17] Parisa Babaali, et al. A quantitative analysis of the relationship between an online homework system and student achievement in pre-calculus. 2015.

[18] Bart Rienties, et al. "Scaling up" learning design: impact of learning design activities on LMS behavior and performance. 2015. LAK.

[19] Martin Hlosta, et al. OU Analyse: analysing at-risk students at The Open University. 2015.