Connecting the STEM dots: measuring the effect of an integrated engineering design intervention

Recent publications have elevated the priority of increasing the integration of Science, Technology, Engineering, and Mathematics (STEM) content in K-12 education. The STEM education community must invest in the development of valid and reliable scales to measure STEM content, knowledge fusion, and perceptions of the nature of STEM. This brief report discusses the development of an instrument to measure student perceptions of the interdependent nature of STEM content knowledge in the context of a complex classroom intervention implemented in five Colorado high schools (N = 275). Specifically, cross-functional science, technology, engineering, and mathematics teams of high school students were formed to complete engineering design problems. Exploratory (pretest) and confirmatory (posttest) factor analyses indicated that a newly adapted scale measuring student perceptions of the interdependent nature of STEM content knowledge possessed adequate model fit. Furthermore, the analysis revealed a novel pattern of results for the intervention: students with initially high perceptions of the interdependent nature of STEM sustained those high perceptions at posttest, whereas students with initially low perceptions exhibited statistically significant positive gains from pretest to posttest. Therefore, this intervention may work best with students who are at risk of losing interest in STEM disciplines. The implications of these research findings are discussed.
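The report does not include analysis code; as a rough illustration of the factor-retention decision that typically precedes an exploratory factor analysis of a pretest scale, the sketch below implements Horn's parallel analysis in Python. The function name, data shape, and file name are hypothetical and not taken from the study.

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_iter: int = 1000, seed: int = 0) -> int:
    """Horn's parallel analysis: retain factors whose observed eigenvalues
    exceed the mean eigenvalues obtained from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n_obs, n_items = data.shape

    # Eigenvalues of the observed item correlation matrix, largest first.
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

    # Mean eigenvalues of correlation matrices from simulated random normal data.
    random_eigs = np.zeros((n_iter, n_items))
    for i in range(n_iter):
        sim = rng.standard_normal((n_obs, n_items))
        random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = random_eigs.mean(axis=0)

    # Number of factors whose observed eigenvalue exceeds the random-data mean.
    return int(np.sum(observed > threshold))

# Hypothetical usage: an (N x k) array of Likert-type item responses at pretest.
# responses = np.loadtxt("pretest_items.csv", delimiter=",")
# print(parallel_analysis(responses))
```

A retained-factor count from a procedure like this would then inform the exploratory model, with the posttest data reserved for a separate confirmatory factor analysis, as described in the abstract.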
