Kyllonen, 2002) is a rapidly evolving research area where cognitive and psychometric theories are used to produce tests that contain items created using computer technology. AIG serves as a technology-enhanced approach to test development that addresses one of the most pressing and challenging issues facing educators today: the rapid and efficient production of high-quality, content-specific test items. AIG can be characterized as the process of using models to generate items with the aid of computer technology. The role of the test development specialist is critical for the creative task of identifying the required content as well as designing and developing meaningful item models. The role of computer technology is critical for the algorithmic task of systematically combining large amounts of content in each item model to produce new assessment tasks. By combining content expertise with computer technology, testing specialists can produce models that yield large numbers of high-quality items in a short period of time.

There are practical reasons why large numbers of test items are needed in modern 21st-century assessment programs. A flexible and accommodating administration schedule has become a basic requirement in most programs because examinees have come to expect continuous, on-demand testing while decision makers want instant access to information about these examinees. But with flexibility also comes security risk; hence, adequate item exposure controls are needed to ensure that administration is secure and that each test yields fair and equitable information about all examinees. Typically, a bank is developed that serves as a repository for the items as well as a database to maintain information about them, including their content codes, psychometric characteristics, and usage rates.
With frequent testing, these banks must be continually replenished with new items to ensure that examinees receive a constant supply of unique, content-specific assessment tasks while, at the same time, limiting item exposure within the testing environment to maintain security.

For much of the 20th century, tests were developed and administered in one language. But profound global, technological, and economic changes occurring at the end of the 20th century have resulted in a dramatic increase in multilingual testing. Educational and psychological tests are now developed and administered to examinees in different languages across diverse cultures throughout the world (Hambleton, Merenda, & Spielberger, 2005). As a result, large numbers of items are not only required to promote flexible administration with adequate security, but …
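The generative step described above, in which computer technology systematically combines content within an item model, can be illustrated with a minimal sketch. The stem template, slot names, and answer rule below are entirely hypothetical and are not drawn from any particular AIG system; real item models also encode constraints, distractors, and psychometric metadata.

```python
import itertools

# A hypothetical item model: a stem template with variable "slots".
# Every name and value here is illustrative, not from any cited system.
ITEM_MODEL = {
    "stem": ("A patient weighs {weight} kg and requires {dose} mg/kg of "
             "drug X. What is the total dose in mg?"),
    "slots": {
        "weight": [50, 60, 70, 80],
        "dose": [2, 5, 10],
    },
}

def generate_items(model):
    """Systematically combine slot values to yield concrete items."""
    names = list(model["slots"])
    items = []
    for values in itertools.product(*(model["slots"][n] for n in names)):
        bindings = dict(zip(names, values))
        items.append({
            "stem": model["stem"].format(**bindings),
            # Answer rule is specific to this hypothetical model.
            "answer": bindings["weight"] * bindings["dose"],
        })
    return items

items = generate_items(ITEM_MODEL)
print(len(items))  # 4 weights x 3 doses = 12 generated items
```

Even this toy model shows the leverage involved: a single template with a handful of slot values yields a dozen distinct items, and production models with more slots and richer value sets scale combinatorially.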
[1] Isaac I. Bejar. A Generative Analysis of a Three-Dimensional Spatial Task, 1990.
[2] Isaac I. Bejar, et al. A Feasibility Study of On-the-Fly Item Generation in Adaptive Testing, 2002.
[3] Steven M. Downing, et al. Handbook of Test Development, 2006.
[4] Isaac I. Bejar. Generative Response Modeling: Leveraging the Computer as a Test Delivery Medium, 1996.
[5] Mark J. Gierl, et al. Automatic Item Generation: Theory and Practice, 2012.
[6] R. Hambleton, et al. Adapting Educational and Psychological Tests for Cross-Cultural Assessment, 2004.
[7] Anne Wendt, et al. Developing Item Variants: An Empirical Study, 2009.
[8] Thomas M. Haladyna, et al. Using Weak and Strong Theory to Create Item Models for Automatic Item Generation: Some Practical Guidelines with Examples, 2012.
[9] Mark J. Gierl, et al. Using Automatic Item Generation to Create Multiple-Choice Test Items, Medical Education, 2012.
[10] Martin Arendasy, et al. Automatic Generation of Rasch-Calibrated Items: Figural Matrices Test GEOM and Endless-Loops Test EC, 2005.
[11] Mark J. Gierl, et al. Developing a Taxonomy of Item Model Types to Promote Assessment Engineering, 2008.
[12] Susan E. Embretson, et al. Understanding and Quantifying Cognitive Complexity Level in Mathematical Problem Solving Items, 2008.
[13] Paul Deane, et al. Multilingual Generalization of the ModelCreator Software for Math Item Generation, 2005.
[14] A. Laduca, et al. Item Modelling Procedure for Constructing Content-Equivalent Multiple Choice Questions, Medical Education, 1986.
[15] Markus Sommer, et al. Using Psychometric Technology in Educational Assessment: The Case of a Schema-Based Isomorphic Approach to the Automatic Generation of Quantitative Reasoning Items, 2007.
[16] John Cresswell, et al. PISA 2009 Technical Report, 2012.