Relationships Between Quantitative Measures of Evaluation Plan and Program Model Quality and a Qualitative Measure of Participant Perceptions of an Evaluation Capacity Building Approach

Despite heightened emphasis on evaluation capacity building and evaluation quality, few tools are available for identifying high-quality evaluation. In the context of testing the Systems Evaluation Protocol (SEP), quantitative rubrics were designed and tested to assess the quality of evaluation plans and program models. Interview data were also collected and analyzed using a priori codes. A mixed methods approach was used to synthesize the quantitative and qualitative data and explore trends. Consistencies between the two data types were found for attitude and capacity, while disconnects were found for knowledge, cyberinfrastructure, time, and quality. This approach to data integration represents a novel way to tap the generative potential of the divergence that arises when different methods produce contradictory results.
