How Unstable are ‘School Effects’ Assessed by a Value-Added Technique?

This paper reconsiders the widespread use of value-added (VA) approaches to estimate school 'effects', and shows the results to be very unstable over time. It uses as an example the contextualised value-added (CVA) scores of all secondary schools in England. The study asks how many schools with at least 99% of their pupils included in the VA calculations, and with data for all years, had VA measures that were clearly positive for five consecutive years. The answer is: none. Whatever VA is measuring, if it is measuring anything at all, it is not a consistent characteristic of schools. Finding no schools with five successive years of positive VA means that parents could not use it to judge how well their primary-age children would do at age 16 in their future secondary school. CVA is used here for the calculations because good data covering five years are available, allowing a judgement of its consistency as a purported school characteristic. However, what is true of CVA is almost certainly true of VA approaches more generally, whether for schools, colleges, departments or individual teachers, in England and elsewhere. Until these problems have been resolved by further development to handle missing and erroneous data, value-added models should not be used in practice. Commentators, policy-makers, educators and families need to be warned. If value-added scores are as meaningless as they appear to be, there is a serious ethical issue wherever they have been, or continue to be, used to reward and punish schools or to make policy decisions.
