Optimally Discriminative Choice Sets in Discrete Choice Models: Application to Data-Driven Test Design
Difficult test questions can be made easy by providing a set of possible answer options of which most are obviously wrong. In the education literature, a plethora of instructional guides exist for crafting a suitable set of wrong choices (distractors) in order to probe the students' understanding of the tested concept. The art of multiple-choice question design thus hinges on the question-maker's experience and knowledge of the potential misconceptions. In contrast, we advocate a data-driven approach, where correct and incorrect options are assembled directly from the students' own past submissions. Large-scale online classroom settings, such as massive open online courses (MOOCs), provide an opportunity to design optimal and adaptive multiple-choice questions that are maximally informative about the students' level of understanding of the material. We deploy a multinomial-logit discrete choice model for the setting of multiple-choice testing, derive an optimization objective for selecting optimally discriminative option sets, and demonstrate the effectiveness of our approach via a user study.
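To make the setting concrete, the following is a minimal sketch of how a multinomial-logit (MNL) choice model can score candidate option sets. The paper's actual optimization objective is not reproduced here; as a hypothetical stand-in, this sketch scores each option set by the entropy of its MNL choice distribution, so that sets among which students are maximally uncertain (hence most diagnostic of fine-grained understanding) score highest. All utility values and function names are illustrative assumptions, not taken from the paper.

```python
import itertools
import math

def mnl_probs(utilities):
    """Multinomial-logit choice probabilities over an option set:
    P(i | S) = exp(u_i) / sum_{j in S} exp(u_j), computed with a
    max-shift for numerical stability."""
    m = max(utilities)
    exps = [math.exp(u - m) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]

def entropy(probs):
    """Shannon entropy of a discrete distribution (in nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def most_discriminative_set(option_utilities, set_size):
    """Brute-force search over all option subsets of the given size,
    returning the one whose MNL choice distribution has maximal
    entropy (a hypothetical proxy for the paper's objective)."""
    best_set, best_score = None, -1.0
    for subset in itertools.combinations(range(len(option_utilities)), set_size):
        probs = mnl_probs([option_utilities[i] for i in subset])
        score = entropy(probs)
        if score > best_score:
            best_set, best_score = subset, score
    return best_set, best_score

# Utilities a student population assigns to five candidate answer
# options, e.g. estimated from past submissions (illustrative numbers).
utils = [2.0, 1.9, 0.1, -1.5, 2.1]
chosen, score = most_discriminative_set(utils, 3)
```

Under this entropy proxy, the search favors grouping the three near-equal-utility options (indices 0, 1, 4), since a near-uniform choice distribution reveals the most about which misconception a student holds; real deployments would replace brute-force enumeration with a structured objective over learned utilities.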