Evaluating social policies: principles and U.S. experience

Studies, proposals, and plans for social programs invariably carry a strong recommendation for evaluation and monitoring. Reliable information about what works, and why, is clearly vital for improving existing programs and designing future ones. Producing such information requires effective methods of evaluation, and policymakers who use evaluations need to understand those methods: the pitfalls to watch for and the relative advantages and disadvantages of different techniques in different situations. This article describes these evaluation methods and the experience accumulated in the United States in applying them in practice.
