How should the societal impact of research be generated and measured? A proposal for a simple and practicable approach to allow interdisciplinary comparisons

Since the 1990s, the scope of research evaluation has widened to encompass the societal products (outputs), societal use (societal references) and societal benefits (changes in society) of research. Evaluation has been extended to include measures of the (1) social, (2) cultural, (3) environmental and (4) economic returns from publicly funded research, even though no robust or reliable methods for measuring societal impact have yet been developed. In this study, we introduce an approach which, unlike the currently common case study approach (and others), is relatively simple, can be applied in almost every subject area and delivers results on societal impact that can be compared across disciplines. Our approach starts from the actual function of science in society: to generate reliable knowledge. A study summarising the state of research on a given subject (which we refer to as an assessment report) therefore represents knowledge that is available for society to access. Societal impact arises when the content of such a report is taken up outside of science (in a government document, for example).
