Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report

Bibliometric indicators such as journal impact factors, h-indices, and total citation counts are algorithmic artifacts that can be used in research evaluation and management. These artifacts have no meaning by themselves; they receive their meaning from attributions in institutional practices. We distinguish four main stakeholders in these practices: (1) producers of bibliometric data and indicators; (2) bibliometricians who develop and test indicators; (3) research managers who apply the indicators; and (4) the scientists under evaluation, who may have competing career interests. These different positions can lead to different, and sometimes conflicting, perspectives on the meaning and value of the indicators. The indicators can thus be considered boundary objects that are socially constructed in translations among these perspectives. This paper proposes an analytical clarification by listing an informed set of (sometimes unsolved) problems in bibliometrics, which can also shed light on the tension between simple but invalid indicators that are widely used (e.g., the h-index) and more sophisticated indicators that are not, or cannot be, used in evaluation practices because they are not transparent to users, cannot be calculated, or are difficult to interpret.
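The h-index mentioned above is a case in point of a simple, widely used indicator: an author has index h if h of their papers have at least h citations each, and the remaining papers have no more than h citations each. As a minimal illustration (not taken from the paper itself), it can be computed from a list of per-paper citation counts:

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have h or more citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the bar
        else:
            break
    return h
```

For example, an author with papers cited 10, 8, 5, 4, and 3 times has h = 4: four papers have at least four citations, but not five papers with at least five. The sketch also illustrates one of the validity problems the paper raises: the index is insensitive to how heavily the top papers are cited, so very different citation records can yield the same h.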
