A Conceptual Framework for Studying the Sources of Variation in Program Effects

Evaluations of public programs in many fields reveal that different types of programs, or different versions of the same program, vary in their effectiveness. Moreover, a program that is effective for one group of people may not be effective for other groups, and a program that is effective in one set of circumstances may not be effective in others. This paper presents a conceptual framework for research on such variation in program effects and its sources. The framework is intended to help researchers, both those who focus mainly on studying program implementation and those who focus mainly on estimating program effects, see how their respective pieces fit together to identify the factors that explain variation in program effects and to support more systematic data collection toward that end. The ultimate goal of the framework is to enable researchers to offer better guidance to policymakers and program operators on the conditions and practices associated with larger and more positive program effects.
