The evaluation of citation distributions

This paper reviews a number of recent contributions showing that a blend of welfare economics and statistical analysis is useful for evaluating the citations received by scientific papers in the periodical literature. The paper begins by clarifying the role of citation analysis in research evaluation. Next, it summarizes results on the basic features of citation distributions at different aggregation levels. These results indicate that citation distributions share the same broad shape, are highly skewed, and are often crowned by a power law. In light of this evidence, a novel methodology for the evaluation of research units is illustrated by comparing the high- and low-citation impact achieved by the US, the European Union, and the rest of the world in 22 scientific fields. However, contrary to recent claims, it is shown that mean normalization at the sub-field level does not lead to a universal distribution. Nevertheless, among other topics subject to ongoing research, this lack of universality does not appear to preclude sensible normalization procedures for comparing the citation impact of articles in different scientific fields.
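
Before comparing fields, citation counts are commonly normalized by the mean of the corresponding sub-field; the universality claim at issue is that such mean-normalized distributions collapse onto a single curve. The following is a minimal sketch of mean normalization in Python. The field names, citation counts, and the 5x tail threshold are illustrative assumptions, not data or parameters from the paper.

    from statistics import mean

    # Hypothetical citation counts for articles in two sub-fields
    # (illustrative numbers only, not data from the paper).
    citations_by_field = {
        "field_A": [0, 1, 1, 2, 3, 5, 8, 40, 120],
        "field_B": [0, 0, 2, 4, 6, 9, 15, 60, 300],
    }

    def mean_normalize(counts):
        # Divide each article's citation count by its field mean so that
        # articles from different fields are placed on a common scale.
        mu = mean(counts)
        return [c / mu for c in counts]

    normalized = {field: mean_normalize(counts)
                  for field, counts in citations_by_field.items()}

    # If mean normalization produced a universal distribution, the normalized
    # distributions would coincide across fields; the paper argues they do not.
    for field, values in normalized.items():
        tail_share = sum(v for v in values if v >= 5) / sum(values)
        print(field, "share of normalized citations held by articles >= 5x the field mean:",
              round(tail_share, 2))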

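The evaluation methodology referred to above borrows tools from welfare economics, in particular from the measurement of economic poverty relative to a poverty line. As a rough illustration only, the sketch below computes a poverty-style low-impact index and a high-impact headcount around an assumed critical citation line; the functional forms, the citation counts, and the value of the line are assumptions for illustration and need not match the indicators defined in the paper.

    # Illustrative, poverty-style citation-impact indices built around a
    # critical citation line (CCL). Functional forms and numbers are assumed.

    def low_impact_index(citations, ccl, alpha=1.0):
        # Average of the normalized citation shortfalls ((ccl - c) / ccl) ** alpha
        # over all articles, with articles at or above the CCL contributing zero.
        n = len(citations)
        return sum(((ccl - c) / ccl) ** alpha for c in citations if c < ccl) / n

    def high_impact_headcount(citations, ccl):
        # Share of articles whose citation count exceeds the CCL.
        n = len(citations)
        return sum(1 for c in citations if c > ccl) / n

    # Hypothetical citation counts for two research units.
    unit_a = [0, 1, 2, 2, 3, 7, 11, 45]
    unit_b = [0, 0, 1, 1, 2, 4, 90, 200]
    ccl = 5  # e.g. a multiple of the world mean in the field (assumed value)

    for name, unit in [("unit_a", unit_a), ("unit_b", unit_b)]:
        print(name,
              "low-impact index:", round(low_impact_index(unit, ccl), 2),
              "high-impact headcount:", round(high_impact_headcount(unit, ccl), 2))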