Advancing science or advancing careers? Researchers’ opinions on success indicators

The way in which we assess researchers has come under increasing scrutiny in the past few years. Critics argue that current research assessments focus on productivity and thereby increase unhealthy pressures on scientists. Yet the precise ways in which assessments should change remain open for debate. We surveyed Flemish researchers to understand how they work and how they rate the relevance of specific indicators used in research assessments. We found that most researchers worked far beyond their expected working schedule. We also found that, although they spent most of their time doing research, respondents wished they could dedicate more time to it and spend less time writing grants and performing other activities such as administrative duties and meetings. Looking at success indicators, we found that indicators related to openness, transparency, quality, and innovation were perceived as highly important for advancing science but as relatively overlooked in career advancement. Conversely, indicators that signalled prestige and competition were generally rated as important to career advancement but irrelevant, or even detrimental, to advancing science. Open comments from respondents further revealed that, although practices reflecting openness, transparency, and quality (e.g., publishing open access, publishing negative findings, sharing data) should ultimately be valued more in research assessments, the resources and support currently in place are insufficient to allow researchers to adopt such practices. In other words, current research assessments are inadequate and ignore practices that are essential to the advancement of science. Yet, before we change the way in which researchers are assessed, supporting infrastructures must be put in place to ensure that researchers are able to commit to the activities that may benefit the advancement of science.
Submission history: The first version of this manuscript was submitted to PLOS ONE on 27 June 2020. Revisions following peer review were submitted on 4 October 2020 and 20 November 2020.
