The effects of funding and co-authorship on research performance in a small scientific community

The evaluation of research performance increasingly relies on quantitative indicators determined by national science policies. We consider two dimensions of research performance, productivity and excellence, as defined in the evaluation methodology of the Slovenian Research Agency, and analyze the effects of two science policy factors, co-authorship collaboration and researcher funding, on the productivity and excellence of Slovenian researchers at the level of research disciplines. Because the data are nested at several levels, a multilevel regression analysis based on a hierarchical linear model was applied. Since many of the variables have a semi-continuous distribution, a statistical model suited to such variables was used. The results show a very strong positive effect of international co-authorship collaboration on both productivity and excellence, while fragmentation of funding has a negative impact on excellence only. We also report interviews with excellent Slovenian researchers about their views on scientific excellence and quantitative indicators.
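
The modelling step described above can be pictured with a minimal, hypothetical sketch: a two-part treatment of a semi-continuous outcome, with a logistic model for whether a researcher has any output at all and a mixed-effects model for the positive values that includes a random intercept per discipline. All variable names (productivity, intl_coauth, funding_frag, discipline) and the simulated data are illustrative assumptions, and fitting two separate models is a simplification of a joint two-part random-effects specification; it is not the authors' actual model.

```python
# Hypothetical sketch of a two-part approach to a semi-continuous outcome
# (many exact zeros, positive otherwise), with researchers nested in disciplines.
# Variable names and simulated data are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "discipline": rng.integers(0, 20, n),     # grouping (nesting) level
    "intl_coauth": rng.poisson(2, n),         # international co-authorships
    "funding_frag": rng.uniform(0, 1, n),     # funding fragmentation index
})

# Simulate a semi-continuous productivity score: exactly zero for many
# researchers, strictly positive for the rest.
latent = 0.4 * df["intl_coauth"] - 0.8 * df["funding_frag"] + rng.normal(0, 1, n)
df["productivity"] = np.where(latent > 0.5, np.exp(latent), 0.0)

# Part 1: probability of having any output (logistic regression).
df["any_output"] = (df["productivity"] > 0).astype(int)
part1 = smf.logit("any_output ~ intl_coauth + funding_frag", data=df).fit(disp=False)

# Part 2: mixed-effects linear model for the log of the positive outcomes,
# with a random intercept for each discipline.
pos = df[df["productivity"] > 0].copy()
pos["log_prod"] = np.log(pos["productivity"])
part2 = smf.mixedlm("log_prod ~ intl_coauth + funding_frag",
                    data=pos, groups=pos["discipline"]).fit()

print(part1.summary())
print(part2.summary())
```

The two parts answer different questions: the logistic part captures whether science policy factors are associated with producing any output at all, while the mixed-effects part captures their association with the level of output among active researchers, with the random intercept absorbing discipline-level differences.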
