Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level

Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from the performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article-level and field-independent, and provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research, and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
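To make the ratio described above concrete, the sketch below shows the shape of the calculation in Python: an article's citation rate is divided by an expected rate obtained by regressing a peer group's citation rates against their co-citation-network field rates. This is a minimal illustration only; the function names, the toy numbers, and the use of a simple ordinary-least-squares fit for the benchmarking step are assumptions for demonstration, not the actual iCite implementation.

```python
# Minimal sketch of the Relative Citation Ratio (RCR) idea from the abstract.
# Helper names, toy data, and the OLS benchmarking step are illustrative
# assumptions, not the published iCite algorithm.

from statistics import mean


def article_citation_rate(citations: int, years_since_publication: float) -> float:
    """Citations accrued per year by the article of interest."""
    return citations / years_since_publication


def field_citation_rate(cocited_journal_rates: list[float]) -> float:
    """Field rate: average journal citation rate over the article's co-citation network."""
    return mean(cocited_journal_rates)


def ols_fit(x: list[float], y: list[float]) -> tuple[float, float]:
    """Ordinary least-squares slope and intercept (peer-group benchmarking step)."""
    mx, my = mean(x), mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx


def relative_citation_ratio(acr: float, fcr: float,
                            benchmark_fcrs: list[float],
                            benchmark_acrs: list[float]) -> float:
    """RCR = article citation rate / expected citation rate, where the expected
    rate is the benchmark regression evaluated at the article's field citation
    rate (RCR = 1.0 means 'typical for the peer comparison group')."""
    slope, intercept = ols_fit(benchmark_fcrs, benchmark_acrs)
    expected_citation_rate = slope * fcr + intercept
    return acr / expected_citation_rate


if __name__ == "__main__":
    # Toy numbers, purely illustrative.
    acr = article_citation_rate(citations=60, years_since_publication=5)
    fcr = field_citation_rate([4.1, 6.3, 5.0, 7.2, 3.8])
    benchmark_fcrs = [3.0, 4.5, 5.5, 6.0, 8.0]  # peer-group field citation rates
    benchmark_acrs = [2.5, 4.0, 5.2, 6.1, 7.9]  # peer-group article citation rates
    print(f"RCR ~ {relative_citation_ratio(acr, fcr, benchmark_fcrs, benchmark_acrs):.2f}")
```

Under these assumptions, an RCR above 1.0 indicates an article cited more often per year than the peer benchmark predicts for its field, and a value below 1.0 indicates the reverse.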
