A Comparison of Paper-Based and Web-Based Testing.

Rebecca Pollard Cole, Dan MacIsaac, and David M. Cole
Northern Arizona University

The purpose of this study (1,313 college student participants) was to examine differences between paper-based and Web-based administrations of a commonly used assessment instrument, the Force Concept Inventory (FCI) (D. Hestenes, M. Wells, & G. Swackhamer, 1992). Results demonstrated no appreciable difference in FCI scores or FCI items based on the type of administration. Analyses showed differences in FCI scores due to gender and time of administration (pre-test and post-test); however, none of these differences was influenced by the type of test administration (Web or paper). Similarly, student FCI scores were comparable with respect to test reliability. For individual FCI items, paper-based and Web-based administrations were compared by examining potential differences in item means and in response patterns. Chi-square tests showed no differences in response patterns, and t tests showed no differences in item means, between paper-based and Web-based administrations. In summary, the Web-based administration of the FCI appears to be as efficacious as the paper-based administration. Lessons learned from the implementation of Web-administered testing are also discussed. (Contains 2 figures, 4 tables, and 27 references.)
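As a rough illustration (not taken from the paper) of the item-level comparisons described above, the sketch below runs an independent-samples t test on item means and a chi-square test on response patterns using SciPy; all scores and option counts are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical dichotomous scores (1 = correct, 0 = incorrect) for one FCI item.
paper_scores = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])
web_scores = np.array([1, 1, 0, 1, 0, 1, 0, 1, 1, 0])

# Independent-samples t test: does the item mean differ by administration mode?
t_stat, t_p = stats.ttest_ind(paper_scores, web_scores)
print(f"t = {t_stat:.3f}, p = {t_p:.3f}")

# Chi-square test of independence: do response-choice patterns differ?
# Rows are administration modes; columns are (hypothetical) counts of
# students selecting each of the item's five response options.
contingency = np.array([
    [40, 12, 8, 5, 3],  # paper-based
    [38, 14, 9, 4, 4],  # Web-based
])
chi2, chi_p, dof, _expected = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {chi_p:.3f}")
```

A non-significant result on both tests, for each item, would correspond to the paper's finding that administration mode made no appreciable difference.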

[1] Jeffery M. Saul et al. Student expectations in introductory physics, 1998.

[2] L. E. Klopfer et al. Factors influencing the learning of classical mechanics, 1980.

[3] Douglas Huffman et al. What does the Force Concept Inventory actually measure?, 1995.

[4] Carolyn S. Clausing et al. Does Computer Screen Color Affect Reading Rate?, 1990.

[5] D. Hestenes et al. Force Concept Inventory, 1992.

[6] R. Linn. Educational measurement, 3rd ed., 1989.

[7] R. Hake. Interactive-engagement vs. Traditional Methods in Mechanics Instruction, 1998.

[8] A. B. Arons et al. Computer-Based Instructional Dialogs in Science Courses, Science, 1984.

[9] Ibrahim A. Halloun et al. The initial knowledge state of college physics students, 1985.

[10] David Hestenes et al. Interpreting the Force Concept Inventory: A response to Huffman and Heller, 1995.

[11] Robert J. Beichner et al. Web-based testing in physics education: methods and opportunities, 1998.

[12] Theo J. H. M. Eggen et al. Computerized Adaptive Testing: What It Is and How It Works, 1998.

[13] Carolyn S. Clausing et al. Paper versus CRT: Are Reading Rate and Comprehension Affected?, 1990.

[14] Steven L. Wise et al. Computer-Based Testing in Higher Education, 1990.

[15] David Zandvliet et al. A Comparison of Computer-Administered and Written Tests, 1997.

[16] Michael Waugh et al. Effects of microcomputer-administered diagnostic testing on immediate and continuing science achievement and attitudes, 1985.

[17] Arnold B. Arons et al. Overcoming conceptual difficulties in physical science through computer-based Socratic dialogs, 1986.

[18] Alfred Bork et al. Learning with computers, 1981.

[19] Carolyn S. Clausing et al. The Effects of Computer Usage on Computer Screen Reading Rate, 1989.