Improving Science Assessments by Situating Them in a Virtual Environment

Current science assessments typically present a series of isolated fact-based questions, poorly representing the complexity of how real-world science is constructed. The National Research Council asserts that this needs to change to reflect a more authentic model of science practice. We strongly concur and suggest that good science assessments need to address several key factors: integration of science content with scientific inquiry, contextualization of questions, efficiency of grading, and statistical validity and reliability. Through our Situated Assessment using Virtual Environments for Science Content and Inquiry (SAVE Science) research project, we have developed an immersive virtual environment to assess middle school children’s understanding of science content and processes that they have been taught through typical classroom instruction. In the virtual environment, participants complete a problem-based assessment by exploring a game world, interacting with computer-based characters and objects, and collecting and analyzing possible clues to the assessment problem. Students can solve the problems situated in the virtual environment in multiple ways; many of these are equally correct, while others uncover misconceptions regarding inference-making. In this paper, we discuss stage one in the design and assessment of our project, focusing on our design strategies for integrating content and inquiry assessment and on early implementation results. We conclude that immersive virtual environments do offer the potential for creating effective science assessments based on our framework, and that engagement needs to be considered as part of that framework.
