Direct and Unobtrusive Measures of Informal STEM Education Outcomes

The free-choice nature of informal STEM education (ISE) makes rigorous and contextually appropriate evaluation of outcomes challenging. Traditional measures such as surveys and interviews have been widely used in ISE evaluations, but they have limitations: they are typically self-reports susceptible to the reactive effects of measurement, and they tend to intrude on the participant's learning experience. The ISE field needs measures that capture outcomes more directly and less obtrusively, permitting triangulation of outcomes across multiple measures. In this chapter, we define what we mean by direct and unobtrusive measures, and we discuss the feasibility and future of using such measures in ISE evaluations, drawing on examples from the field. We include a case study in which we adapt a school-based performance assessment for embedding into the informal learning experiences of participants at a STEM tinkering workshop, and we highlight successes, challenges, and implications for the future.
