What Doesn’t Work? Three Failures, Many Answers

Current debates on impact evaluation have addressed the question ‘what works and what doesn’t?’ mainly by focussing on methodology failures in the provision of evidence of impact. To answer that question, this article contrasts different approaches to evaluation in terms of how they address different kinds of possible failure. First, there is more to be debated than methodological failure alone: there are also programme theory failures and implementation failures. Moreover, not all methodological failures are a simple matter of selection bias. Second, the article reviews issues that have recently been raised within different approaches in relation to each kind of failure. For programme theory failure, the issues are complexity and the provision of rival explanations; for implementation failure, how to use guidelines and how to take context into account; and for methodology failure, how to move from internal to external validity, and towards syntheses, within the framework of ‘situational responsiveness’. All these issues open up a terrain for potential exchange between the protagonists of different approaches to impact evaluation.
