Integrating Curriculum, Instruction, Assessment, and Evaluation in a Technology-Supported Genetics Learning Environment

This article describes an extended collaboration between a development team and an evaluation team working with GenScope, an open-ended exploratory software tool. In some respects, this was a routine evaluation, documenting substantial gains (of roughly 1 SD) in genetics reasoning ability in all but one of 17 classes, despite the challenges presented by school computer-lab settings. Relative to matched comparison classes, larger gains were found in technical biology and general science courses but not in college-prep or honors biology courses. In other respects, our effort illustrates the value of newer views of assessment, technology, and research. The alignment of a sophisticated research assessment with simpler classroom assessments shed light on initial failures and spurred revision. By refining the GenScope activities and extending the classroom assessments, we supported worthwhile whole-class discourse around a shared understanding of the software. A follow-up study in a laptop-equipped classroom yielded the absolute and relative gains (3.1 SD and 1.6 SD) that proponents of such innovations have long promised. In retrospect, the strengths and weaknesses of the study illustrate the value of newer “design-based” approaches to educational research.
