Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?

This paper introduces a citation-based "systems approach" for analyzing the various institutional and cognitive dimensions of scientific excellence within national research systems. The methodology, covering several aggregate levels, focuses on the most highly cited research papers in the international journal literature. The distribution of these papers across institutions and disciplines enables objective comparisons of their (possible) international-level scientific excellence. By way of example, we present key results from a recent series of analyses of the research system in the Netherlands in the mid-1990s, focusing on the performance of the universities across the major scientific disciplines within the context of the entire system's scientific performance. Special attention is paid to the contribution to the world's top 1% and top 10% most highly cited research papers. The findings indicate that these high-performance papers provide a useful analytical framework, in terms of transparency, cognitive and institutional differentiation, and scope for domestic and international comparisons, yielding new indicators for identifying "world class" scientific excellence at the aggregate level. The average citation scores of these academic "Centres of Scientific Excellence" appear to be an inadequate predictor of their production of highly cited papers. However, further critical reflection and in-depth validation studies are needed to establish the true potential of this approach for science policy analysis and the evaluation of research performance.
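To make the top 1%/top 10% indicator concrete, the following is a minimal Python sketch of how a percentile-based highly cited papers count could be computed: papers are ranked by citations within each discipline, those at or above the top-share threshold are flagged, and each institution's tally of flagged papers is reported. The institution names, disciplines, and citation counts are invented for illustration, and this is not the authors' actual procedure, which draws on the full international journal literature and further breakdowns (e.g. by publication year).

# A minimal sketch (toy data; not the paper's actual procedure):
# flag each paper whose citation count reaches the top-share threshold
# within its discipline, then tally flagged papers per institution.
from collections import defaultdict

papers = [
    # (institution, discipline, citations) -- invented illustrative values
    ("Univ A", "Physics",   250),
    ("Univ A", "Physics",     3),
    ("Univ B", "Physics",    90),
    ("Univ B", "Chemistry", 400),
    ("Univ A", "Chemistry",  12),
    ("Univ B", "Chemistry",   7),
]

def threshold(citation_counts, top_share):
    # Citation count needed to fall within the top `top_share` fraction.
    ranked = sorted(citation_counts, reverse=True)
    k = max(1, round(top_share * len(ranked)))
    return ranked[k - 1]

def highly_cited_counts(papers, top_share):
    # Per-discipline thresholds, computed over the whole (world) paper set.
    by_discipline = defaultdict(list)
    for _inst, disc, cites in papers:
        by_discipline[disc].append(cites)
    cutoffs = {d: threshold(c, top_share) for d, c in by_discipline.items()}
    # Tally each institution's papers at or above its discipline's cutoff.
    counts = defaultdict(int)
    for inst, disc, cites in papers:
        if cites >= cutoffs[disc]:
            counts[inst] += 1
    return dict(counts)

print(highly_cited_counts(papers, 0.10))  # top 10% papers per institution
print(highly_cited_counts(papers, 0.01))  # top 1% papers per institution

Note that a simple rank cutoff of this kind over-selects when citation counts are tied at the threshold; operational indicators of this type handle ties and field normalization more carefully.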
