Analyzing the disciplinary focus of universities: Can rankings be a one-size-fits-all?

The phenomenon of rankings is intimately related to governments' interest in auditing the research outputs of universities. New forms of managerialism have been introduced into the higher education system, leading funding bodies to take a growing interest in external evaluation tools for allocating funds. Rankings rely heavily on bibliometric indicators, yet bibliometricians have been highly critical of their use. Among other issues, they have pointed out the over-simplistic view rankings offer of universities' research output, treating institutions as homogeneous and ignoring disciplinary differences. Although many university rankings now include league tables by field, reducing the complex framework of universities' research activity to a single dimension leads to poor judgment and decision making, partly because of the influence disciplinary specialization has on research evaluation. This chapter analyzes, from a methodological perspective, how rankings suppress the disciplinary differences that are key to interpreting them correctly.

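As a minimal sketch of the methodological point (with hypothetical numbers, not data from the chapter), the following Python snippet shows how a single aggregate indicator such as raw citations per paper can penalize a university specialized in low-citation fields, even when its within-field performance is identical to that of a biomedicine-heavy institution; a field-normalized (MNCS-style) indicator removes this artifact of disciplinary mix.

# Illustrative sketch with hypothetical figures: disciplinary mix vs.
# a single-dimension citation indicator.

# Hypothetical publication profiles: (field, papers, total citations)
universities = {
    "Univ A (biomedicine-heavy)": [
        ("Biomedicine", 900, 9000),   # 10 citations/paper within field
        ("Humanities", 100, 200),     #  2 citations/paper within field
    ],
    "Univ B (humanities-heavy)": [
        ("Biomedicine", 100, 1000),   # 10 citations/paper within field
        ("Humanities", 900, 1800),    #  2 citations/paper within field
    ],
}

# Hypothetical world averages per field (citations per paper)
world_average = {"Biomedicine": 10.0, "Humanities": 2.0}

for name, profile in universities.items():
    papers = sum(p for _, p, _ in profile)
    citations = sum(c for _, _, c in profile)
    # Naive single-dimension indicator: raw citations per paper
    raw_cpp = citations / papers
    # Field-normalized indicator: citations divided by the world average
    # of the corresponding field, averaged over all papers (MNCS-style)
    normalized = sum(c / world_average[f] for f, _, c in profile) / papers
    print(f"{name}: raw = {raw_cpp:.2f} citations/paper, "
          f"field-normalized = {normalized:.2f}")

With these figures both universities obtain a field-normalized score of 1.0 (exactly the world average in every field), yet the raw indicator ranks Univ A at 9.2 citations per paper against 2.8 for Univ B purely because of its disciplinary profile.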