Embracing the principles of the San Francisco Declaration on Research Assessment: Robert Balaban’s Editorial
Early career scientists today face many obstacles in their effort to achieve an independent career, including stiff competition for grant funding and jobs. Many of these hurdles reappear throughout one’s scientific career; thus, for example, achieving and maintaining external funding for one’s research is a widely used criterion in evaluations for academic advancement. One widely perceived barrier to scientific success and career advancement is the necessity of having recent publications in “high impact” journals. Most of us have had personal experiences with grant review panels, academic search committees, and academic promotion committees in which the evaluation of scientific productivity appears to be reduced to journal impact-factor arithmetic.
There is a growing realization in the scientific community that the use of the journal impact factor as an evaluation criterion for the quality of the science in an article is harmful not only to individuals who are being evaluated for jobs, promotions, and grants but, indeed, to the very fabric of science, as commented upon previously in these pages (Andersen, 2008). Indeed, a large body of scientists, including ourselves, has taken an open stand on this matter in signing the San Francisco Declaration on Research Assessment (DORA). DORA lays out a set of principles for the assessment of science and scientists to which its signatories have subscribed, and it has inspired insightful editorials and commentaries (see “News about DORA” for links to some of these pieces).
Here we rejoin the chorus calling for the scientific community, in its actions on search committees, promotion committees, and study sections, to refocus its evaluations on the content and substance of publications rather than on their venue. Given the broad distribution of scientific evaluation, it is impractical to expect a top-down reformation of how science is evaluated. However, one department, one panel, one committee at a time, we can experiment with new approaches, holding each of these to criteria we publicly acknowledge. As part of this effort, in this issue we open our editorial pages as a forum for discussion by scientific leaders in the physiology community. In the first guest editorial devoted to this topic, Dr. Robert Balaban, scientific director of the Division of Intramural Research at the National Heart, Lung, and Blood Institute (NHLBI), shares his values for evaluating research productivity and describes how he has implemented these values at NHLBI. To effect a renewal of the evaluation process, we scientists need to acknowledge that we have been complicit in allowing publication venue to serve as a surrogate for quality. Then, the grassroots effort underway in DORA could result in a community in which the quality of the research is the prime criterion of scientific merit. As Marc Kirschner so eloquently put it, “The scientific community must create leadership with the courage and independence to take control of the structure of its training, the peer review of its journals, the organization of grant review panels, and the overall priorities that are set” (Kirschner, 2013).
[1] M. Kirschner. A Perverted View of “Impact.” Science, 2013.
[2] O. Andersen. Editorial Practices, Scientific Impact, and Scientific Quality. The Journal of General Physiology, 2008.