Reasons for Using Mixed Methods in the Evaluation of Complex Projects

Evaluations of development projects are conducted to assess their net effectiveness and, by extension, to guide decisions regarding the merits of scaling up successful projects and/or replicating them elsewhere. The key characteristics of ‘complex’ interventions – numerous face-to-face interactions, high discretion, imposed obligations, pervasive unknowns – rarely fit neatly into standard evaluation protocols, and thus require the deployment of a wider array of research methods, tools and theory. The careful use of such ‘mixed methods’ approaches is especially important for discerning the conditions under which ‘successful’ projects of all kinds might be expanded or adopted elsewhere. These claims, and the practical implications to which they give rise, draw on an array of recent evaluations across different sectors in development.