What’s in a Word? Extending Learning Factors Analysis to Model Reading Transfer

Learning Factors Analysis (LFA) has been proposed as a general method for evaluating and comparing cognitive models of learning [13]. By performing a heuristic search over a space of statistical models, a researcher can evaluate different cognitive representations of a set of skills. We introduce a scalable application of this framework to transfer in reading and demonstrate it on Reading Tutor data. Taking a word-level model of learning as a baseline, we apply LFA to determine whether a representation with fewer independent word skills produces a better fit to student learning data. Specifically, we show that representing some groups of words by their common root yields a better-fitting model of student knowledge, indicating that this representation carries more information than treating words as independent, atomic skills and suggesting that students exploit root information in their representation of words. We also present an approximation to LFA that scales tractably to large datasets. Finally, evidence from both model fit and learning-rate relationships indicates that low-proficiency students tend to exhibit less transfer through the word-root representation than higher-proficiency students.
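The model comparison the abstract describes can be sketched as follows: fit one logistic model that treats each word as an independent skill, fit a second that merges words sharing a common root into a single skill, and compare the fits penalized by model size. The sketch below is a minimal illustration on simulated data, with hypothetical words and parameter values rather than the paper's Reading Tutor data or its actual LFA search, and it uses AIC [2] as one possible comparison criterion.

```python
import math
import random

random.seed(0)

# Hypothetical word-to-root mapping (illustrative, not the paper's word list).
WORD_ROOT = {"runs": "run", "running": "run", "jumps": "jump", "jumping": "jump"}
WORDS = sorted(WORD_ROOT)
ROOTS = sorted(set(WORD_ROOT.values()))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Simulate students whose success probability grows with practice on the
# shared root, i.e. the generative process exhibits root-level transfer.
word_trials, root_trials = [], []  # (skill_index, opportunity_count, correct)
for student in range(30):
    wc = {w: 0 for w in WORDS}   # opportunities seen per word
    rc = {r: 0 for r in ROOTS}   # opportunities seen per root
    sequence = WORDS * 8
    random.shuffle(sequence)     # practice words in a random order
    for w in sequence:
        r = WORD_ROOT[w]
        correct = 1 if random.random() < sigmoid(-1.0 + 0.5 * rc[r]) else 0
        word_trials.append((WORDS.index(w), wc[w], correct))
        root_trials.append((ROOTS.index(r), rc[r], correct))
        wc[w] += 1
        rc[r] += 1

def fit_skill_model(trials, n_skills, lr=0.1, iters=2000):
    """Fit P(correct) = sigmoid(a[skill] + b[skill] * opportunities) by batch
    gradient ascent; return the maximized log-likelihood."""
    a = [0.0] * n_skills  # per-skill intercept (initial difficulty)
    b = [0.0] * n_skills  # per-skill slope (learning rate)
    n = len(trials)
    for _ in range(iters):
        ga, gb = [0.0] * n_skills, [0.0] * n_skills
        for i, c, y in trials:
            resid = y - sigmoid(a[i] + b[i] * c)
            ga[i] += resid
            gb[i] += resid * c
        for i in range(n_skills):
            a[i] += lr * ga[i] / n
            b[i] += lr * gb[i] / n
    return sum(math.log(p if y else 1.0 - p)
               for i, c, y in trials
               for p in [sigmoid(a[i] + b[i] * c)])

def aic(log_likelihood, k):
    """Akaike information criterion [2]: lower is better."""
    return 2 * k - 2 * log_likelihood

ll_word = fit_skill_model(word_trials, len(WORDS))  # words as independent skills
ll_root = fit_skill_model(root_trials, len(ROOTS))  # words merged by common root
aic_word = aic(ll_word, 2 * len(WORDS))
aic_root = aic(ll_root, 2 * len(ROOTS))
print(f"word-level AIC: {aic_word:.1f}, root-level AIC: {aic_root:.1f}")
```

On data simulated with root-level transfer, the root-based model typically attains the lower (better) AIC despite having half as many parameters, mirroring the kind of comparison the paper uses to argue that word roots carry information beyond independent word skills.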

[1]  Allen Newell, et al., Human Problem Solving, 1973.

[2]  H. Akaike, A new look at the statistical model identification, 1974.

[3]  R. Woodcock, Woodcock Reading Mastery Tests-Revised, 1987.

[4]  S. Menard, Coefficients of Determination for Multiple Logistic Regression Analysis, 2000.

[5]  Jack Mostow, et al., Evaluating tutors that listen: an overview of Project LISTEN, 2001.

[6]  Jack Mostow, et al., Viewing and analyzing multimodal human-computer tutorial dialogue: a database approach, Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces, 2002.

[7]  J. Beck, et al., Improving Language Models by Learning from Speech Recognition Errors in a Reading Tutor that Listens, 2003.

[8]  Neil T. Heffernan, et al., Why Are Algebra Word Problems Difficult? Using Tutorial Log Files and the Power Law of Learning to Select the Best Fitting Cognitive Model, Intelligent Tutoring Systems, 2004.

[9]  Jonathan E. Freyberger, et al., Using Association Rules to Guide a Search for Best Fitting Transfer Models of Student Learning, 2004.

[10]  Jack Mostow, et al., Using Speech Recognition to Evaluate Two Student Models for a Reading Tutor, 2005.

[11]  Joseph E. Beck, et al., Using learning decomposition to analyze student fluency development, 2005.

[12]  Automating Cognitive Model Improvement by A* Search and Logistic Regression, 2005.

[13]  Kenneth R. Koedinger, et al., Learning Factors Analysis - A General Method for Cognitive Model Evaluation and Improvement, Intelligent Tutoring Systems, 2006.

[14]  David Hinkley, et al., Bootstrap Methods: Another Look at the Jackknife, 2008.