Getting to the Bottom Line: A Method for Synthesizing Findings Within Mixed-method Program Evaluations

Evaluators who are concerned more with pragmatics than with competing epistemologies have brought multi- and mixed-method evaluations into common practice. Program evaluators commonly use multiple methods and mixed data to capture both the breadth and depth of information pertaining to the evaluand, and to strengthen the validity of findings. However, multiple or mixed methods may yield incongruent results, and evaluators may find themselves reporting seemingly conflicting findings to program staff, policy makers, and other stakeholders. Our purpose is to offer a method for synthesizing findings within multi- or mixed-method evaluations to reach defensible (primarily summative) evaluation conclusions. The proposed method uses a set of criteria and analytic techniques to assess the worth of each data source or type and to establish what each says about program effect. Once findings are placed on a common scale, simple arithmetic allows synthesis across data sources or types. The method should prove a useful tool for evaluators.
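The synthesis step described above — weighting each data source by its assessed worth, rating its evidence of program effect on a common scale, and combining the results arithmetically — can be sketched as follows. This is a minimal illustration, not the authors' published procedure; the weights, ratings, and source names are hypothetical.

```python
def synthesize(findings):
    """Combine per-source effect ratings into one weighted score.

    findings: list of (worth_weight, effect_rating) pairs, where
    effect_rating is already expressed on a common scale
    (e.g., -1 = clearly harmful, 0 = no effect, +1 = strongly positive)
    and worth_weight reflects the assessed quality of the source.
    """
    total_weight = sum(weight for weight, _ in findings)
    if total_weight == 0:
        raise ValueError("at least one source must carry positive weight")
    # Weighted average: higher-quality sources contribute more
    # to the overall program-effect conclusion.
    return sum(weight * effect for weight, effect in findings) / total_weight

# Hypothetical example: three data sources with invented weights/ratings.
findings = [
    (3.0, 0.8),   # survey: high quality, strong positive effect
    (2.0, 0.5),   # interviews: moderate quality, moderate effect
    (1.0, -0.2),  # records review: lower quality, slight negative effect
]
overall_effect = synthesize(findings)
```

A single weighted score like `overall_effect` is what lets the evaluator state one defensible bottom-line conclusion even when individual sources disagree, while the weights make the quality judgments behind that conclusion explicit and auditable.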
