Assessing evaluation procedures for individual researchers: The case of the Italian National Scientific Qualification

The Italian National Scientific Qualification (ASN) was introduced as a prerequisite for applying for tenured associate or full professor positions at state-recognized universities. The ASN is meant to attest that an individual has reached a suitable level of scientific maturity to apply for professorship positions. A five-member panel, appointed for each scientific discipline, is in charge of evaluating applicants by means of quantitative indicators of impact and productivity, and through an assessment of their research profile. Many concerns were raised about the appropriateness of the evaluation criteria, in particular the use of bibliometrics for the evaluation of individual researchers. Additional concerns related to the perceived poor quality of the final evaluation reports. In this paper we assess the ASN in terms of the appropriateness of the applied methodology and the quality of the feedback provided to the applicants. We argue that the ASN is not fully compliant with best practices for the use of bibliometric indicators in the evaluation of individual researchers; moreover, the quality of the final reports varies considerably across panels, suggesting that measures should be put in place to prevent sloppy practices in future ASN rounds.
