A psychometric evaluation of the digital logic concept inventory

Concept inventories hold tremendous promise for enabling rigorous evaluation of teaching methods that aim to remedy common student misconceptions and foster deep learning. Measurements from a concept inventory can be trusted only if the inventory has been evaluated through both expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories apply to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate among students across a wide range of ability levels, providing the most information about weaker students' ability levels.
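
For readers unfamiliar with the two frameworks named above, the short Python sketch below illustrates the kinds of quantities they yield: the KR-20 reliability coefficient from Classical Test Theory for dichotomously scored items, and the item information function of a two-parameter logistic (2PL) Item Response Theory model, which peaks at the ability level an item measures best. The response matrix, item parameters, and function names here are illustrative placeholders under generic assumptions, not the DLCI's actual items or the analysis reported in the paper.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """KR-20 reliability for dichotomously scored (0/1) items.

    responses: (n_students, n_items) array of item scores.
    """
    k = responses.shape[1]                          # number of items
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p @ q) / total_var)

def item_information_2pl(theta: np.ndarray, a: float, b: float) -> np.ndarray:
    """Fisher information of a 2PL item at ability levels theta.

    a: discrimination, b: difficulty. I(theta) = a^2 * P * (1 - P),
    where P is the 2PL probability of a correct response.
    """
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder 0/1 response matrix (200 students x 20 hypothetical items).
    scores = rng.integers(0, 2, size=(200, 20))
    print(f"KR-20 reliability: {kr20(scores):.2f}")

    theta = np.linspace(-3, 3, 7)                   # ability grid
    # Information peaks near theta = b, so easier items (lower b) are
    # most informative about weaker students.
    print(item_information_2pl(theta, a=1.2, b=-0.5))
```

In this framing, the abstract's claims map onto these quantities: "sufficiently reliable for research purposes" corresponds to an adequate reliability coefficient such as KR-20, and "providing the most information about weaker students" corresponds to the test information (the sum of item information functions) peaking at lower ability levels.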
