Measuring growth in students’ proficiency in MOOCs: Two component dynamic extensions for the Rasch model

Massive open online courses (MOOCs) are increasingly popular among students of various ages and at universities around the world. The main aim of a MOOC is growth in students’ proficiency, which is why students, professors, and universities are interested in measuring that growth accurately. Traditional psychometric approaches based on item response theory (IRT) assume that a student’s proficiency is constant over time and are therefore not well suited for measuring growth. In this study we sought to go beyond this assumption by (a) proposing to measure two components of growth in proficiency in MOOCs; (b) applying this idea in two dynamic extensions of the most common IRT model, the Rasch model; (c) illustrating these extensions through analyses of logged data from three MOOCs; and (d) checking the quality of the extensions using a cross-validation procedure. We found that proficiency grows both across whole courses and within learning objectives. In addition, our dynamic extensions fit the data better than the original Rasch model, and both performed well, with an average accuracy of .763 in predicting students’ responses from real MOOCs.
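To make the modeling idea concrete, the following is a minimal LaTeX sketch of how a two-component dynamic extension of the Rasch model might be parameterized. The growth parameters \delta_1 and \delta_2 and the attempt counts n^{course}_{pt} and n^{objective}_{pt} are illustrative assumptions, not the exact specification estimated in the study.

% Standard Rasch model: probability that student p answers item i correctly
\[
P(X_{pi} = 1) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}
\]

% Illustrative dynamic extension: effective proficiency grows with practice,
% with one term for prior attempts across the whole course and another for
% prior attempts within the current learning objective (assumed form)
\[
P(X_{pit} = 1) = \frac{\exp\!\left(\theta_p + \delta_1\, n^{\text{course}}_{pt} + \delta_2\, n^{\text{objective}}_{pt} - \beta_i\right)}
{1 + \exp\!\left(\theta_p + \delta_1\, n^{\text{course}}_{pt} + \delta_2\, n^{\text{objective}}_{pt} - \beta_i\right)}
\]

Here \theta_p is the baseline proficiency of student p, \beta_i is the difficulty of item i, and n^{course}_{pt} and n^{objective}_{pt} count the items that student p has attempted before occasion t at the course level and within the current learning objective, respectively; \delta_1 and \delta_2 capture the two components of growth.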
