The effects of field normalization baseline on relative performance with respect to citation impact, and their stability: A case study of 20 natural science departments

In this paper we study the effects of the field normalization baseline on the relative performance, in terms of citation impact, of 20 natural science departments. Impact is studied under three baselines: journal, ISI/Thomson Reuters subject category, and Essential Science Indicators field. Citation impact is measured with two indicators: the item-oriented mean normalized citation rate and the Top-5% indicator. The results, which we analyze with respect to stability, show that the choice of normalization baseline matters. Normalization against the publishing journal stands out: irrespective of indicator, the department rankings obtained with journal as baseline differ considerably from those obtained with ISI/Thomson Reuters subject category or Essential Science Indicators field as baseline. Since no substantial differences are observed when the Essential Science Indicators field and ISI/Thomson Reuters subject category baselines are contrasted, users without access to subject category data may still perform reasonable normalized citation impact studies by combining normalization against journal with normalization against Essential Science Indicators field.
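The two indicators named above can be illustrated with a minimal sketch. This is not the paper's actual code, and the department names, classes, and citation counts are invented toy data; it only shows the general idea: the item-oriented mean normalized citation rate divides each paper's citation count by the expected (mean) citation rate of its baseline class and averages those ratios, while a Top-5% style indicator reports the share of a department's papers at or above a high citation percentile within their class.

```python
# Hypothetical sketch of the two indicator types, using toy data.
# "Baseline class" stands in for any of the three baselines studied
# (journal, subject category, or ESI field).
from statistics import mean

# Toy data: (department, baseline_class, citations).
papers = [
    ("DeptA", "chem", 12), ("DeptA", "chem", 3), ("DeptA", "phys", 7),
    ("DeptB", "chem", 1),  ("DeptB", "phys", 20), ("DeptB", "phys", 2),
]

# Expected citation rate per baseline class (the "world average" there).
classes = {c for _, c, _ in papers}
expected = {c: mean(cit for _, cc, cit in papers if cc == c) for c in classes}

def mncr(dept):
    """Item-oriented mean normalized citation rate for one department:
    average over the department's papers of (citations / class mean)."""
    ratios = [cit / expected[c] for d, c, cit in papers if d == dept]
    return mean(ratios)

def top_share(dept, quantile=0.95):
    """Share of the department's papers at or above the class-specific
    citation threshold (here an approximate 95th percentile per class;
    with realistic data the threshold selects the top 5% of papers)."""
    thresholds = {}
    for c in classes:
        cits = sorted(cit for _, cc, cit in papers if cc == c)
        thresholds[c] = cits[int(quantile * (len(cits) - 1))]
    dept_papers = [(c, cit) for d, c, cit in papers if d == dept]
    hits = sum(1 for c, cit in dept_papers if cit >= thresholds[c])
    return hits / len(dept_papers)

for d in ("DeptA", "DeptB"):
    print(d, round(mncr(d), 2), round(top_share(d), 2))
```

Changing the class labels in the toy data is the analogue of switching baselines in the paper: the same citation counts can yield different department rankings once the expected rates and percentile thresholds are computed within different classes.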
