Instructional Topics in Educational Measurement (ITEMS) Module: Using Automated Processes to Generate Test Items

Changes to the design and development of our educational assessments are resulting in an unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer technology. The purpose of this module is to describe and illustrate a template-based method for generating test items. We outline a three-step approach in which test development specialists first create an item model. An item model is like a mould or rendering that highlights the features of an assessment task that must be manipulated to produce new items. Next, the content used for item generation is identified and structured. Finally, the features in the item model are systematically manipulated with computer-based algorithms to generate new items. Using this template-based approach, hundreds or even thousands of new items can be generated from a single item model.
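To make the three steps concrete, the sketch below shows a minimal template-based generator in Python. The ITEM_MODEL dictionary, its placeholder names, and the dosage scenario are hypothetical examples introduced here for illustration, not the authors' templates or software; the point is simply that exhaustively combining the allowed values of each manipulable element turns one item model into many items.

```python
from itertools import product

# Hypothetical item model: a stem template with placeholders and the
# structured content (allowed values) for each manipulable element.
ITEM_MODEL = {
    "stem": "A patient weighing {weight} kg requires {dose} mg/kg of drug X. "
            "How many mg should be administered?",
    "elements": {
        "weight": [50, 60, 70, 80],   # manipulable feature 1
        "dose": [2, 4, 5],            # manipulable feature 2
    },
}

def generate_items(model):
    """Systematically manipulate the model's elements to produce new items."""
    names = list(model["elements"])
    for values in product(*(model["elements"][n] for n in names)):
        bindings = dict(zip(names, values))
        stem = model["stem"].format(**bindings)
        key = bindings["weight"] * bindings["dose"]  # correct answer for this model
        # Distractors derived from plausible calculation errors (illustrative only).
        options = sorted({key,
                          key + bindings["dose"],
                          key - bindings["dose"],
                          bindings["weight"] + bindings["dose"]})
        yield {"stem": stem, "options": options, "key": key}

if __name__ == "__main__":
    items = list(generate_items(ITEM_MODEL))
    print(f"Generated {len(items)} items from one item model")
    print(items[0])
```

With four weights and three doses, this single model yields twelve distinct items; adding elements or values scales the count multiplicatively, which is how template-based AIG produces hundreds or thousands of items from one model.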
