The suitability of h and g indexes for measuring the research performance of institutions

It is becoming ever more common to use bibliometric indicators to evaluate the performance of research institutions; however, the limits and drawbacks of such indicators are often not recognized. Since performance measurement is aimed at supporting critical decisions by research administrators and policy makers, it is essential to test empirically the robustness of the indicators used. In this work we examine the accuracy of the popular "h" and "g" indexes for measuring university research performance by comparing the rankings derived from their application against the ranking from a third indicator that better meets the requirements for robust and reliable assessment of institutional productivity. The test population is all Italian universities in the hard sciences, observed over the period 2001–2005. The analysis quantifies the correlations between the three university rankings (by discipline) and the shifts that occur when the indicator changes, in order to measure the distortion inherent in the use of the h and g indexes and their comparative accuracy for assessing institutions.
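As a rough illustration of the mechanics behind such a comparison, the sketch below computes h and g indexes from per-institution citation counts and a simple Spearman rank correlation between the two resulting institutional rankings. This is a minimal sketch under stated assumptions: the university names and citation counts are invented, the g-index is capped at the number of papers, ties are ignored in the correlation formula, and the paper's third, reference indicator of productivity is not reproduced here.

```python
from typing import Dict, List


def h_index(citations: List[int]) -> int:
    """Largest h such that at least h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h


def g_index(citations: List[int]) -> int:
    """Largest g such that the top g papers together have at least g^2 citations.

    Simplified: g is capped at the number of papers in the list.
    """
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
        else:
            break
    return g


def spearman_rho(rank_a: List[int], rank_b: List[int]) -> float:
    """Spearman rank correlation between two rankings of the same units (no ties)."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))


# Hypothetical citation profiles for three universities in one discipline.
universities: Dict[str, List[int]] = {
    "Uni A": [25, 18, 12, 9, 6, 4, 2, 1],
    "Uni B": [40, 3, 2, 1, 1, 0, 0],
    "Uni C": [15, 14, 13, 12, 11, 2, 1, 0, 0],
}

# Rank institutions separately by h and by g, then compare the two orderings.
h_rank = sorted(universities, key=lambda u: h_index(universities[u]), reverse=True)
g_rank = sorted(universities, key=lambda u: g_index(universities[u]), reverse=True)

pos_h = [h_rank.index(u) + 1 for u in universities]
pos_g = [g_rank.index(u) + 1 for u in universities]
print(f"Spearman rho between h- and g-based rankings: {spearman_rho(pos_h, pos_g):.2f}")
```

In the paper's setting the same comparison is run per discipline and against the reference productivity indicator, so rank shifts between lists quantify the distortion introduced by relying on h or g alone.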
