Are NAEP Executive Summary Reports Understandable to Policy Makers and Educators?

Ronald K. Hambleton and Sharon C. Slater
University of Massachusetts at Amherst

Abstract

This study is a follow-up to several recent studies of NAEP reports which found that policy makers and the media were misinterpreting text, figures, and tables. Its purposes were (1) to investigate the extent to which NAEP Executive Summary Reports are understandable to policy makers and educators, and (2) to the extent that problems were identified, to offer a set of recommendations for improving NAEP reporting practices. The main finding from this interview study with 59 policy makers and educators is that, in general, these two user groups had considerable difficulty with the presentation of results in the NAEP Executive Summary Report they were given. Misunderstandings and mistakes in reading the report were common. Many of the persons interviewed (1) had limited prior exposure to NAEP, (2) were unfamiliar with the NAEP reporting scale, and (3) had a limited knowledge of statistics. These shortcomings contributed substantially to the problems encountered in reading the Executive Summary Report. Several recommendations are offered for improving the NAEP reports. First, all displays of data should be field tested prior to their use in NAEP Executive Summary Reports.
A second recommendation is that NAEP reports for policy makers and educators should be considerably simplified. A third recommendation is that NAEP reports tailored to particular audiences may be needed to improve clarity, understandability, and usefulness. Appendixes display key tables and figures from the Executive Summary of the NAEP 1992 mathematics report card, describe participant characteristics, and present the interview protocol.

Background

The main purpose of the National Assessment of Educational Progress (NAEP) is to provide policy makers, educators, and the public with information about what students in elementary, middle, and high schools know and can do, and to monitor changes in student achievement over time. In view of the importance of NAEP data for effective educational policy making and for informing the public about the status of education in America, as well as trends in educational achievement over time, considerable statistical and psychometric sophistication (the best available in the country) is applied to test design, data collection, test data analysis, and scaling (see, for example, Beaton & Johnson, 1992; Johnson, 1992; Mislevy, Johnson, & Muraki, 1992). Considerably less attention in the NAEP design has been given to the ways in which data are organized and reported to NAEP audiences, which include policy makers, educators, and the public, though important progress has been made (Beaton & Allen, 1992). Item response theory (IRT) scaling, the use of anchor points and performance standards in score interpretations, and the plausible values methodology for obtaining score distributions have been important in enhancing NAEP score reporting. Still, concerns about NAEP data reporting have become an issue in recent years and were documented by Jaeger (1992), Koretz and Deibert (1993), Linn and Dunbar (1992), and Wainer (1994, 1995a, 1995b). Controversy also exists with respect to the proper interpretation of anchor levels and achievement levels (i.e., performance standards), which have become central concepts in NAEP reporting (American College Testing, 1993; Forsyth, 1991; Hambleton & Bourque, 1991; National Academy of Education, 1993; Stufflebeam, Jaeger, & Scriven, 1991).

Author Note: The work reported herein was supported under National Center for Education Statistics Contract No. RS90159001, as administered by the Office of Educational Research and Improvement, U.S. Department of Education. The findings and opinions expressed in this report do not necessarily reflect the position or policies of the National Center for Education Statistics, the Office of Educational Research and Improvement, or the U.S. Department of Education. The authors are pleased to acknowledge the constructive suggestions of Ray Fields, Mary Frase, Robert Linn, and Howard Wainer on an earlier draft of this report, and of Dan Koretz on the design of the study.
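For readers unfamiliar with the item response theory (IRT) scaling mentioned in the Background, the sketch below is a minimal illustration only, not NAEP's operational procedure: it shows the three-parameter logistic item response function that underlies this kind of scaling, with hypothetical item parameter values chosen for demonstration.

```python
import numpy as np

def three_pl(theta, a, b, c):
    """Three-parameter logistic (3PL) item response function.

    Returns the probability of a correct response for examinees at
    proficiency theta, given item discrimination a, difficulty b,
    and lower asymptote (guessing parameter) c.
    """
    return c + (1.0 - c) / (1.0 + np.exp(-1.7 * a * (theta - b)))

# Hypothetical item parameters, for illustration only -- not actual NAEP items.
theta = np.linspace(-3.0, 3.0, 7)   # proficiency values on the IRT theta scale
print(np.round(three_pl(theta, a=1.2, b=0.0, c=0.2), 3))
```

In operational NAEP reporting, proficiency estimates from models of this kind are placed on a 0-500 reporting scale and summarized with plausible values; the sketch is intended only to convey what "IRT scaling" refers to.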

References

[1] A. Beaton, et al. Overview of the scaling methodology used in the National Assessment, 1992.

[2] R. Linn, et al. Raising the stakes of test administration: The impact on student performance on NAEP, 1993.

[3] I. S. Kirsch, et al. Adult literacy in America: A first look at the results of the National Adult Literacy Survey, 1993.

[4] R. Forsyth. Do NAEP scales yield valid criterion-referenced interpretations?, 1991.

[5] Nancy L. Allen, et al. Interpreting scales through scale anchoring, 1992.

[6] Howard Wainer. Understanding graphs and tables, 1992.

[7] Daniel Koretz, et al. Interpretations of National Assessment of Educational Progress (NAEP) anchor points and achievement levels by the print media in 1991, 1993.

[8] Mary Lyn Bourque, et al. The levels of mathematics achievement: Initial performance standards for the 1990 NAEP mathematics assessment. Volume I: National and state summaries, 1991.

[9] Ina V. S. Mullis, et al. The state of mathematics achievement: NAEP's 1990 assessment of the nation and the trial assessment of the states, 1991.

[10] Interpreting NAEP scales, 1993.

[11] Eugene G. Johnson, et al. Scaling procedures in NAEP, 1992.

[12] Eugene G. Johnson. The design of the National Assessment of Educational Progress, 1992.

[13] Howard Wainer. A study of display methods for NAEP results: I. Tables. Program Statistics Research Technical Report No. 95-1, 1995.

[14] William S. Cleveland. The elements of graphing data, 1980.

[15] Stephen B. Dunbar, et al. Issues in the design and reporting of the National Assessment of Educational Progress, 1992.

[16] Gary T. Henry, et al. Graphing data: Techniques for display and analysis, 1994.

[17] H. Wainer, et al. Graphical data analysis, 1981.