Predicting Choices of Item Difficulty in Self-Adapted Testing Using Hidden Markov Models

Self-adapted testing is designed to allow examinees to choose the difficulty level of the items they receive. This results in different levels of overall difficulty across exams, but examinees' ability can be estimated regardless of the items chosen using Item Response Theory. Here we also evaluated whether an examinee's selection process could be informative in assessing ability, engagement, and mindset. Two groups of examinees completed a self-adapted general knowledge test under different instructions, one emphasizing performance (fixed mindset) and one emphasizing learning (growth mindset). We modeled examinees' choices of item difficulty using a Hidden Markov Model to predict whether they transition between difficulty levels based on their goal condition, the correctness of their last answer, their confidence in their last answer, and the interactions among these factors. Preliminary results suggest a higher likelihood of examinees choosing more difficult items following correct responses, high confidence, and learning (growth) instructions.
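
To make the modeling idea concrete, the sketch below shows a minimal input-dependent Hidden Markov Model in Python, in which the probability of transitioning between hidden states depends on per-trial covariates such as goal condition, last-answer correctness, and confidence. This is only an illustration of the general technique, not the authors' actual model: the two-state structure, the covariate coding, the parameter values, and all function names are assumptions made for the example.

```python
# Minimal sketch of a covariate-driven (input-dependent) HMM, assuming two
# hidden states (e.g., "prefers easier items" vs. "prefers harder items") and
# per-trial covariates: goal condition, last-answer correctness, confidence.
# All states, covariates, and parameter values here are illustrative only.
import numpy as np

def transition_matrix(x, W):
    """Per-trial transition probabilities.

    x : (d,) covariate vector for the current trial
    W : (K, K, d) weights; W[i, j] scores the i -> j transition
    Returns a (K, K) row-stochastic matrix via a row-wise softmax.
    """
    scores = W @ x                                   # (K, K) transition scores
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    expd = np.exp(scores)
    return expd / expd.sum(axis=1, keepdims=True)

def forward_loglik(choices, covariates, pi, W, emit):
    """Log-likelihood of observed difficulty choices under the sketch model.

    choices    : (T,) observed difficulty levels, coded 0..M-1
    covariates : (T, d) per-trial covariates (condition, correctness, confidence)
    pi         : (K,) initial state distribution
    W          : (K, K, d) transition weights
    emit       : (K, M) probability of choosing each difficulty level per state
    """
    alpha = pi * emit[:, choices[0]]                 # scaled forward recursion
    c = alpha.sum()
    loglik = np.log(c)
    alpha /= c
    for t in range(1, len(choices)):
        A = transition_matrix(covariates[t], W)      # covariate-dependent transitions
        alpha = (alpha @ A) * emit[:, choices[t]]
        c = alpha.sum()
        loglik += np.log(c)
        alpha /= c
    return loglik

# Toy usage with made-up numbers (two states, three difficulty levels):
rng = np.random.default_rng(0)
T, d, K, M = 20, 3, 2, 3
choices = rng.integers(0, M, size=T)
covariates = rng.normal(size=(T, d))                 # e.g. [condition, correct, confidence]
pi = np.array([0.5, 0.5])
W = rng.normal(scale=0.1, size=(K, K, d))
emit = np.array([[0.6, 0.3, 0.1],                    # "easier" state favors easy items
                 [0.1, 0.3, 0.6]])                   # "harder" state favors hard items
print(forward_loglik(choices, covariates, pi, W, emit))
```

In a sketch of this kind, the weights W would be estimated from the observed choice sequences (for example by maximizing the forward log-likelihood), and the effect of each covariate on the probability of moving toward harder items could then be read off the fitted transition weights.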