A new perspective for mixed-methods evaluations

Evaluation is a broad and multifaceted area of study that requires researchers to combine multiple research methods, which in turn presents a challenging agenda for policy-makers. The multifaceted nature of the field points to a pressing need for a comprehensive framework that offers guidelines to researchers and practitioners alike. Several important factors shape the selection of method(s) for an evaluation, including the evaluation's paradigm, approach and purpose; this article addresses each of these and proposes a new perspective for evaluations using a mixed-methods approach. Specifically, the article aims to inform a framework that helps evaluators select appropriate methods, with particular emphasis on the structural components of the working methodologies. Given the significance of the theoretical background of evaluation, a literature review employing a meta-synthesis approach was carried out to inform the development of this new perspective. All related literature, including books and papers published in peer-reviewed journals, was extracted, studied and analysed via open coding. The resulting perspective is therefore grounded in a rich and reliable body of literature and should prove helpful to evaluators interested in using a mixed-methods approach.
