Is there agreement on the prestige of scholarly book publishers in the Humanities? Delphi over survey results

Despite playing an important role in supporting assessment processes, evaluation systems and the categorizations they use are frequently criticized. Since acceptance by the scientific community is essential if rankings or categorizations are to be used in research evaluation, the aim of this paper is to test the results of a ranking of scholarly book publishers' prestige, Scholarly Publishers Indicators (hereafter SPI). SPI is a public, survey-based ranking of scholarly publishers' prestige (among other indicators). The latest version of the ranking (2014) was based on an expert consultation with a large number of respondents. In order to validate and refine the results for the humanities fields proposed by the assessment agencies, a Delphi technique was applied to the initial rankings with a panel of randomly selected experts. The results show an equalizing effect of the technique on the initial rankings, as well as a high degree of concordance between its theoretical aim (consensus among experts) and its empirical results (summarized with the Gini index). The resulting categorization is understood as more conclusive and more likely to be accepted by those under evaluation.
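Because the abstract summarizes expert consensus with the Gini index, a minimal computational sketch may help the reader; the function and the sample ratings below are illustrative assumptions, not the authors' code or data, and use the standard mean-absolute-difference form of the index.

def gini(values):
    # Gini index via the mean absolute difference formulation:
    # G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean(x))
    n = len(values)
    mean = sum(values) / n
    if mean == 0:
        return 0.0
    total = sum(abs(x - y) for x in values for y in values)
    return total / (2 * n * n * mean)

# Hypothetical prestige ratings for one publisher before and after
# the Delphi rounds: a lower Gini index means less dispersion among
# the experts' ratings, i.e. stronger consensus.
before = [1, 2, 5, 9, 10]
after = [4, 5, 5, 6, 6]
print(gini(before))  # ~0.37: high dispersion, weak consensus
print(gini(after))   # ~0.08: low dispersion, strong consensus

In this reading, the paper's "equalizing effect" of the Delphi rounds would correspond to a drop in the Gini index across iterations, as the panel's ratings converge.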
