Underlying success in open‐ended investigations in science: using qualitative comparative analysis to identify necessary and sufficient conditions

Both substantive knowledge (i.e. factual knowledge, concepts, laws and theories) and procedural knowledge (understanding and applying concepts such as reliability and validity, measurement and calibration, data collection, measurement error, the interpretation of evidence and the like) are involved in carrying out an open‐ended science investigation. There is some debate as to whether procedural understanding is of little importance compared with substantive understanding or whether – and this is the view we take – procedural ideas can and should be taught explicitly. We present findings from a study of undergraduate students who took a module that explicitly taught procedural ideas. We employ an innovative method, Charles Ragin’s Qualitative Comparative Analysis (QCA), which involves the analysis of necessary and sufficient conditions and of conjunctions of causes. Findings from a comparison of the students’ performance before and after the teaching, and from the QCA, imply that procedural understanding was indeed a necessary condition for carrying out an open‐ended investigation. It was also sufficient when combined with substantive understanding, prior attainment, or both.
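The set-theoretic logic behind these claims can be made concrete. In crisp-set QCA, a condition is necessary for an outcome when the outcome cases are a subset of the condition cases (consistency = |X ∩ Y| / |Y|), and a condition (or conjunction of conditions) is sufficient when its cases are a subset of the outcome cases (consistency = |X ∩ Y| / |X|). The sketch below uses entirely hypothetical dichotomised data (the variable names P, S, A, Y and the case values are illustrative assumptions, not the study's data) to show how these consistency measures are computed:

```python
# Hypothetical dichotomised cases: presence (1) / absence (0) of
# procedural understanding (P), substantive understanding (S),
# prior attainment (A), and the outcome, success in the investigation (Y).
cases = [
    {"P": 1, "S": 1, "A": 1, "Y": 1},
    {"P": 1, "S": 1, "A": 0, "Y": 1},
    {"P": 1, "S": 0, "A": 1, "Y": 1},
    {"P": 1, "S": 0, "A": 0, "Y": 0},
    {"P": 0, "S": 1, "A": 1, "Y": 0},
    {"P": 0, "S": 0, "A": 0, "Y": 0},
]

def consistency_necessary(cases, cond, outcome="Y"):
    """Share of outcome cases that also show the condition: |X & Y| / |Y|."""
    with_outcome = [c for c in cases if c[outcome] == 1]
    return sum(c[cond] for c in with_outcome) / len(with_outcome)

def consistency_sufficient(cases, conds, outcome="Y"):
    """Share of cases showing the conjunction of conditions that also
    show the outcome: |X & Y| / |X|."""
    with_conds = [c for c in cases if all(c[k] == 1 for k in conds)]
    return sum(c[outcome] for c in with_conds) / len(with_conds)

# P is necessary: every successful case displays P.
print(consistency_necessary(cases, "P"))           # 1.0
# P alone is not sufficient, but the conjunction P & S is.
print(consistency_sufficient(cases, ["P"]))        # 0.75
print(consistency_sufficient(cases, ["P", "S"]))   # 1.0
```

A consistency of 1.0 corresponds to a perfect subset relation; in applied QCA (and in fuzzy-set variants) values somewhat below 1.0 are usually tolerated as "quasi-necessity" or "quasi-sufficiency".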
