Measuring researcher independence using bibliometric data: A proposal for a new performance indicator

Bibliometric indicators are increasingly used to evaluate individual scientists, as exemplified by the popularity of the many publication- and citation-based indicators now deployed in evaluation. These indicators, however, cover at best two of the quality dimensions relevant for assessing a researcher: productivity and impact. Research quality has more dimensions than productivity and impact alone, and because current bibliometric indicators do not cover several of these important dimensions, we contribute here to developing better indicators for the quality dimensions not yet addressed. One dimension lacking valid indicators is an individual researcher's independence. We propose indicators measuring different aspects of independence: two assessing whether a researcher has developed a collaboration network of their own, and two assessing the level of thematic independence. Taken together, they form an independence indicator. We illustrate how these indicators distinguish between researchers who are equally productive and have comparable impact. The independence indicator is a step forward in evaluating individual scholarly quality.
