Exploring evaluator perceptions of the characterization of impact under the REF2014

Because ‘impact’ is a relatively new criterion for research assessment, no empirical study has yet examined the process of its evaluation. This article is part of a broader study exploring the panel-based peer and end-user review process for societal impact evaluation, using the UK’s national research assessment exercise, the Research Excellence Framework (REF) 2014, as a case study. Specifically, it examines the perceptions that REF2014 evaluators held of societal impact before evaluating it as part of REF2014. Data are drawn from 62 interviews, conducted before the REF2014 exercise took place, with evaluators from the health-related Panel A and its subpanels. We show that, going into the REF exercise, Panel A evaluators held differing perceptions of how to characterize impact and of how to define impact realization in terms of research outcomes and the research process. We conclude by discussing the implications of these findings for future impact evaluation frameworks, and by proposing a series of hypotheses about how evaluators’ differing perceptions going into an impact assessment could influence the evaluation of impact submissions. Using REF2014 as a case study, these hypotheses will be tested in post-assessment interviews with REF2014 evaluators.
