Items, Skills, and Transfer Models: Which Really Matters for Student Modeling?

Student modeling is widely used in educational data mining and intelligent tutoring systems, both for making scientific discoveries and for guiding instruction. For both goals, high predictive accuracy is important, and researchers have incorporated a variety of features into student models. However, because different modeling techniques also rely on different features, it is difficult to tell, when comparing approaches, whether predictive power comes from the model or from the features it uses. To disentangle these factors, we performed empirical studies that varied which features the models considered: items, skills, and transfer models. We found that item difficulty is a better predictor than skill difficulty or than student proficiencies defined by the transfer model. We also evaluated two versions of the PFA model; the version using item difficulty achieved slightly higher predictive accuracy than the version using skill difficulty. Finally, prior work has shown that considering students' overall proficiencies, rather than only those the transfer model deems relevant, works substantially better on ASSISTments data; in this study, however, we did not observe that effect consistently in data collected from the Cognitive Tutor.
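
For concreteness, here is a sketch of the two PFA variants being compared, assuming the standard logistic formulation of Performance Factors Analysis (Pavlik, Cen, & Koedinger, 2009); the exact parameterization of the item-difficulty variant is our assumption and may differ in detail from the paper. In the skill-level form, the probability that a student answers item $i$ correctly is

\[
P(\text{correct} \mid i) \;=\; \sigma\!\Big(\sum_{j \in \mathrm{KCs}(i)} \big(\beta_j + \gamma_j\, s_j + \rho_j\, f_j\big)\Big),
\qquad \sigma(x) = \frac{1}{1 + e^{-x}},
\]

where $\mathrm{KCs}(i)$ is the set of skills (knowledge components) tagged to item $i$ by the transfer model, $\beta_j$ is the difficulty parameter of skill $j$, and $s_j$, $f_j$ count the student's prior successes and failures on skill $j$. The item-level variant replaces the per-skill intercept with a per-item difficulty parameter:

\[
P(\text{correct} \mid i) \;=\; \sigma\!\Big(\beta_i + \sum_{j \in \mathrm{KCs}(i)} \big(\gamma_j\, s_j + \rho_j\, f_j\big)\Big).
\]

The comparison in the abstract amounts to asking which intercept, $\beta_j$ (skill) or $\beta_i$ (item), yields better predictive accuracy when the practice-count terms are held in the same form.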
