Extending the reach of randomized social experiments: new directions in evaluations of American welfare‐to‐work and employment initiatives

Summary. Random assignment experiments are widely used in the USA to test the effectiveness of new social interventions. This paper discusses several major welfare-to-work experiments, highlighting their evolution from simple 'black box' tests of single interventions to multigroup designs used to compare alternative interventions or to isolate the effects of components of an intervention. The paper also discusses new efforts to combine experimental and non-experimental analyses to test underlying programme theories and to maximize the knowledge gained about the effectiveness of social programmes. Researchers and policy makers in other countries may find this variety of approaches useful to consider as they debate an expanded role for social experiments.

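As a loose illustration of the multigroup designs described above, the sketch below simulates a three-arm random assignment study (a control group plus two alternative welfare-to-work programme models) and estimates each programme's impact on earnings as a difference in group means. It is a minimal, hypothetical example on synthetic data: the arm labels, sample sizes, effect sizes, and earnings outcome are assumptions for exposition, not figures from any of the evaluations the paper discusses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-arm experiment: individuals are randomly assigned to a
# control group or to one of two alternative programme models ("A" and "B").
n_per_arm = 5000
arms = {"control": 0.0, "programme_A": 400.0, "programme_B": 650.0}  # assumed true impacts ($/year)

earnings = {}
for arm, true_impact in arms.items():
    # Assumed baseline annual earnings plus the arm's treatment effect and noise.
    earnings[arm] = 9000.0 + true_impact + rng.normal(0.0, 3000.0, n_per_arm)

# Because assignment is random, a simple difference in means is an unbiased
# estimate of each programme's impact relative to the control group.
control_mean = earnings["control"].mean()
for arm in ("programme_A", "programme_B"):
    impact = earnings[arm].mean() - control_mean
    se = np.sqrt(earnings[arm].var(ddof=1) / n_per_arm
                 + earnings["control"].var(ddof=1) / n_per_arm)
    print(f"{arm}: estimated impact = {impact:.0f}, std. error = {se:.0f}")

# Comparing the two programme arms to each other isolates the effect of the
# design feature that distinguishes them, holding everything else constant.
diff = earnings["programme_B"].mean() - earnings["programme_A"].mean()
print(f"programme_B vs programme_A: {diff:.0f}")
```

In a two-group 'black box' test only the first kind of contrast (programme versus control) is available; the multigroup design adds the programme-versus-programme contrast, which is what allows alternative interventions, or components of an intervention, to be compared under the same random assignment.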