Validity and reliability of the Force and Motion Conceptual Evaluation

The assessment of learning has become a key component of program evaluation, grant proposals, and education research, and such assessment requires valid and reliable instruments. The Force and Motion Conceptual Evaluation (FMCE) is one of several multiple-choice tests used to evaluate the learning of force and motion concepts. Although many physics education researchers accept its validity and reliability, estimates based on standard statistical analyses of test data had not been established. This study used FMCE post-test results from approximately 150 students in a first-semester college physics course to estimate the instrument's reliability and content validity. The results indicate that the FMCE is a valuable instrument for measuring student learning.
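The abstract does not specify which reliability statistic was computed, but internal consistency for a multiple-choice test like the FMCE is commonly estimated with Cronbach's alpha. The following is an illustrative sketch only (not the study's actual analysis), assuming dichotomously scored items (1 = correct, 0 = incorrect):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix.

    scores: list of rows, one per student; each row holds
    the student's 0/1 scores on each test item.
    """
    n_items = len(scores[0])

    def variance(values):
        # Sample variance (n - 1 denominator), as in the usual alpha formula.
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    # Variance of each item column, and of each student's total score.
    item_vars = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_var = variance([sum(row) for row in scores])

    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)


# Hypothetical toy data: four students, three items.
# Perfectly consistent responses give alpha = 1.0.
alpha = cronbach_alpha([[1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 0]])
```

Values of alpha near 1 indicate that items covary strongly, i.e., the test measures a coherent construct; values commonly cited as acceptable for research instruments are around 0.7 or above.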
