When a “sloppy copy” is good enough: Results of a state writing assessment

Abstract Students in grades 5 and 8 completed a state writing assessment, and their first and final drafts on the extended writing portion of the test were copied and scored using the state writing rubric. The rubric consisted of three primary traits: Content and Organization, Style and Fluency, and Language Use. Scorers were blind to the study purpose and scored either a student's first or final draft. No significant difference was found between the first and final drafts written by students in special education at either grade level. Likewise, no significant difference was found for the writing of general education students in grade 8. A significant difference was found, however, between first and final drafts written by fifth-grade students in general education. Cross tabulations conducted at grades 5 and 8 revealed that over 50% of the first drafts received the same score as, or a better score than, the final draft.
