Describing undergraduate STEM teaching practices: a comparison of instructor self-report instruments

Background: Collecting data on instructional practices is an important step in planning and enacting meaningful initiatives to improve undergraduate science instruction. Self-report surveys are among the most common tools for collecting such data. This paper presents an instrument- and item-level analysis of available self-report instruments for surveying postsecondary instructional practices. We qualitatively analyzed the instruments to document their features and systematically sorted their items into distinct categories based on their content. The paper provides a detailed description and evaluation of the instruments, identifies gaps in the literature, and offers suggestions for instrument selection, use, and development based on these findings.

Results: The 12 instruments we analyzed use a variety of measurement and development approaches. There are two primary instrument types: those intended for all postsecondary instructors and those intended for instructors in a specific STEM discipline. The instruments intended for all instructors often address teaching alongside other aspects of faculty work. The number of teaching practice items and the response scales varied widely across instruments. Most teaching practice items referred to the format of in-class instruction (54%), such as group work or problem solving. Another substantial share of items referred to assessment practices (35%), frequently focusing on the specific types of summative assessment items used.

Conclusions: The recent interest in describing teaching practices has led to the development of a diverse set of self-report instruments. Many instruments lack an audit trail of their development, including the rationale for response scales; whole-instrument and construct reliability values; and face, construct, and content validity measures. Future researchers should consider building on these existing instruments to address some of their current weaknesses. In addition, important aspects of instruction are not currently described in any of the available instruments, including laboratory-based instruction, hybrid and online instructional environments, and teaching with elements of universal design.
