Education Production and Incentives∗

The substantial ‘value-added’ literature that seeks to measure the overall impact of teachers on student achievement does not distinguish between teacher effects that are invariant to prevailing incentives and those that are responsive to them. In contrast, we develop an empirical approach that, for the first time, allows us to separate incentive-varying teacher effort from incentive-invariant teacher ability, and further, to explore whether the effects of effort and ability persist differentially. Our strategy exploits exogenous variation in the incentive strength of a well-known federal accountability scheme, along with rich administrative data covering all public school students in North Carolina. We separately identify teacher effort and teacher ability to determine their relative magnitudes contemporaneously, finding that a one standard deviation increase in teacher ability raises student test scores by 21 percent of a standard deviation, while an analogous increase in teacher effort raises scores by 8 percent of a standard deviation. We then use prior incentive strength to reject the hypothesis that teacher ability and teacher effort persist similarly. To supplement our regression-based evidence, we set out a complementary structural estimation procedure, showing that effort affects future scores less than ability does. From a policy perspective, our results indicate that incentives matter when measuring teacher value-added. Our analysis also has implications for the cost-effectiveness of sharpening incentives relative to altering the distribution of teacher ability across classrooms and schools.
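To fix ideas, the decomposition described above can be written as a stylized value-added specification; the notation below (a_j for incentive-invariant ability, I_{jt} for the incentive strength facing teacher j in year t, and \gamma for the effort response) is purely illustrative and is not taken from the paper itself:

\[
A_{ijt} = \lambda A_{ij,t-1} + a_j + \gamma I_{jt} + X_{it}'\beta + \varepsilon_{ijt},
\]

where A_{ijt} is the test score of student i taught by teacher j in year t and X_{it} collects observable controls. Under this sketch, a conventional teacher fixed effect would recover a_j plus the teacher's average incentive-driven effort, \gamma \bar{I}_{j}; it is exogenous variation in I_{jt} that allows the ability and effort components to be identified separately.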
