Perverse effects of output-based research funding? Butler's Australian case revisited

More than ten years ago, Linda Butler (2003a) published a well-cited article claiming that Australian science policy made a mistake in the early 1990s by introducing output-based funding. According to Butler, the policy stimulated researchers to publish more papers, but of lower quality, resulting in a lower total impact of Australian research compared to other countries. We redo and extend the analysis using longer time series, and show that Butler's main conclusions are not correct. We conclude in this paper (i) that the currently available data reject Butler's claim that "journal publication productivity has increased significantly… but its impact has declined", and (ii) that such evidence is hard to find even with a reconstruction of her data. On the contrary, after implementing evaluation systems and performance-based funding, Australia not only improved its share of research output but also increased research quality, implying that total impact grew substantially. Our findings show that if output-based research funding has an effect on research quality, the effect is positive, not negative. This finding has implications for debates about research evaluation and about the assumed perverse effects of incentives, in which the Australian case plays a major role.
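The arithmetic behind the abstract's closing argument can be made explicit: total citation impact is output volume times average (field-normalized) quality, so simultaneous growth in both multiplies. The sketch below is purely illustrative; the function, variable names, and all numbers are hypothetical and are not taken from the paper's data.

```python
# Hypothetical illustration of the abstract's reasoning: if both output
# volume and average quality rise, total impact rises multiplicatively.

def total_impact(n_papers: int, mean_normalized_citation_score: float) -> float:
    """Total citation impact as output volume times field-normalized quality.

    A score of 1.0 corresponds to the world average in the field.
    """
    return n_papers * mean_normalized_citation_score

# Hypothetical pre- and post-reform figures (not the authors' data).
before = total_impact(10_000, 0.85)  # fewer papers, below world average
after = total_impact(14_000, 1.05)   # more papers, above world average

print(f"relative change in total impact: {after / before:.2f}x")  # → 1.73x
```

Under these made-up numbers a 40% rise in output combined with a modest quality gain yields a 73% rise in total impact, which is the multiplicative effect the abstract points to.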

[1] Ulf Sandström, et al. The Complex Relationship between Competitive Funding and Performance, 2014.

[2] L. Butler. A list of published papers is no measure of value, Nature, 2002.

[3] A. van Raan, et al. A bibliometric analysis of six economics research groups: A comparison with peer review, 1993.

[4] Cassidy R. Sugimoto, et al. Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact, Online Inf. Rev., 2015.

[5] Peter van den Besselaar, et al. From bench to bedside: The societal orientation of research leaders: The case of biomedical and health research in the Netherlands, 2012.

[6] W. Broad. The publishing game: getting more for less, Science, 1981.

[7] Diana Hicks, et al. Evolving regimes of multi-university research evaluation, 2009.

[8] Kaare Aagaard, et al. What happens when national research funding is linked to differentiated publication counts? A comparison of the Australian and Norwegian publication-based funding models, 2016.

[9] Michael H. MacRoberts, et al. Problems of citation analysis: A critical review, JASIS, 1989.

[10] Anthony F. J. van Raan, et al. Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises, Scientometrics, 1996.

[11] D. Julian, et al. The evaluation of scientific research, 1990.

[12] Peter van den Besselaar, et al. How do young tenured professors benefit from a mentor? Effects on management, motivation and performance, 2014.

[13] Linda Butler, et al. Impacts of performance-based research funding systems: A review of the concerns and the evidence, 2010.

[14] Donald C. Pelz, et al. Scientists in Organizations: Productive Climates for Research and Development, 1967.

[15] N. Mohaghegh, et al. Why the impact factor of journals should not be used for evaluating research, 2005.

[16] Diana Hicks, et al. One size doesn't fit all: On the co-evolution of national evaluation systems and social science publishing, 2012.

[17] Vincent Larivière, et al. How Many is too Many? On the Relationship between Output and Impact in Research, ISSI, 2015.

[18] Ludo Waltman, et al. Field-Normalized Citation Impact Indicators and the Choice of an Appropriate Counting Method, ISSI, 2015.

[19] D. Evered, et al. The evaluation of scientific research, 1989.

[20] S. de Rijcke, et al. Bibliometrics: The Leiden Manifesto for research metrics, Nature, 2015.

[21] Dora Marinova, et al. The changing research funding regime in Australia and academic productivity, Math. Comput. Simul., 2008.

[22] T. J. Phelan, et al. A compendium of issues for citation analysis, Scientometrics, 1999.

[23] Linda Butler, et al. What Happens when Funding is Linked to Publication Counts, 2004.

[24] B. Martin, et al. Some partial indicators of scientific progress in radio astronomy, 1983.

[25] D. L. Smith, et al. Frequency of Citations as Criterion for the Ranking of Departments, Journals, and Individuals, 1978.

[26] Paula E. Stephan, et al. Research efficiency: Perverse incentives, Nature, 2012.

[27] Vincent Larivière, et al. The weakening relationship between the impact factor and papers' citations in the digital age, J. Assoc. Inf. Sci. Technol., 2012.

[28] U. Sandström, et al. Quantity and/or Quality? The Importance of Publishing Many Papers, PLoS ONE, 2016.

[29] L. Butler, et al. Modifying publication practices in response to funding formulas, 2003.

[30] Deborah Cox, et al. Understanding societal impact through productive interactions: ICT research as a case, 2014.

[31] B. Martin, et al. Assessing Basic Research: Some Partial Indicators of Scientific Progress in Radio Astronomy, Research Policy, 1987.

[32] Deborah Cox, et al. Understanding societal impact through studying productive interactions: ICT research in the UK and the Netherlands, 2013.

[33] D. Simonton. Creativity in Science: Chance, Logic, Genius, and Zeitgeist, 2004.

[34] V. Larivière, et al. How Many Is Too Many? On the Relationship between Research Productivity and Impact, PLoS ONE, 2016.

[35] Paul Wouters, et al. Evaluation practices and effects of indicator use: a literature review, 2016.

[36] D. Hicks. Performance-based university research funding systems, 2012.

[37] Inge van der Weijden, et al. Different views on scholarly talent: What are the talents we are looking for in science?, 2014.

[38] Peter van den Besselaar, et al. Does Quantity Make a Difference?, ISSI, 2015.

[39] Andy Walker. Perverse incentives, Tennessee Medicine, 2013.

[40] Peter van den Besselaar, et al. Counterintuitive effects of incentives, 2017.

[41] Laura Cruz-Castro, et al. Overturning some assumptions about the effects of evaluation systems on publication performance, Scientometrics, 2011.

[42] B. Martin, et al. University Research Evaluation and Funding: An International Comparison, 2003.

[43] L. Butler, et al. Explaining Australia's increased share of ISI publications: the effects of a funding formula based on publication counts, 2003.