Open Evaluation: A Vision for Entirely Transparent Post-Publication Peer Review and Rating for Science

The two major functions of a scientific publishing system are to provide access to scientific papers and to evaluate them. While open access (OA) is becoming a reality, open evaluation (OE), the other side of the coin, has received less attention. Evaluation steers the attention of the scientific community and thus the very course of science. It also influences the use of scientific findings in public policy. The current system of scientific publishing provides only journal prestige as an indication of the quality of new papers and relies on a non-transparent and noisy pre-publication peer-review process, which delays publication by many months on average. Here I propose an OE system in which papers are evaluated post-publication in an ongoing fashion by means of open peer review and rating. Through signed ratings and reviews, scientists steer the attention of their field and build their reputation. Reviewers are motivated to be objective because low-quality or self-serving signed evaluations will negatively impact their reputation. A core feature of this proposal is a division of powers between the accumulation of evaluative evidence and the analysis of this evidence by paper evaluation functions (PEFs). PEFs can be freely defined by individuals or groups (e.g., scientific societies) and provide a plurality of perspectives on the scientific literature. Simple PEFs will compute averages of ratings, weighting reviewers (e.g., by H-index) and rating scales (e.g., by relevance to a particular decision process) in different ways. Complex PEFs will use advanced statistical techniques to infer the quality of a paper. Papers with initially promising ratings will be more deeply evaluated. The continual refinement of PEFs in response to attempts by individuals to influence evaluations in their own favor will make the system ungameable. OA and OE together have the power to revolutionize scientific publishing and usher in a new culture of transparency, constructive criticism, and collaboration.
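To make the idea of a paper evaluation function concrete, here is a minimal sketch in Python of the kind of simple PEF the abstract describes: a weighted average of signed ratings, with reviewers weighted by H-index and rating scales weighted by their relevance to a decision process. The `Rating` structure, the name `simple_pef`, and the example weights are all hypothetical choices for illustration; the proposal deliberately leaves the definition of PEFs open to individuals and groups.

```python
from dataclasses import dataclass


@dataclass
class Rating:
    reviewer: str  # signed: the reviewer's identity is public
    h_index: int   # one possible source of reviewer weight (illustrative)
    scale: str     # e.g., "soundness", "novelty", "relevance"
    score: float   # the rating given on this scale, e.g., on a 0-10 range


def simple_pef(ratings, scale_weights):
    """Weighted mean of ratings: reviewers weighted by H-index,
    rating scales weighted by relevance to a decision process.
    A hypothetical simple PEF, not the paper's prescribed formula."""
    num, den = 0.0, 0.0
    for r in ratings:
        w = r.h_index * scale_weights.get(r.scale, 0.0)
        num += w * r.score
        den += w
    return num / den if den else None


# Usage: two signed ratings of one paper, with scales weighted for a
# hypothetical decision process that prioritizes soundness over novelty.
ratings = [
    Rating("A. Reviewer", h_index=30, scale="soundness", score=8.0),
    Rating("B. Reviewer", h_index=10, scale="novelty", score=6.0),
]
print(simple_pef(ratings, {"soundness": 0.7, "novelty": 0.3}))  # 7.75
```

Because the division of powers separates evidence accumulation from evidence analysis, any such function can operate on the same openly accumulated ratings; a complex PEF might instead fit a statistical model that treats ratings as noisy evidence of latent paper quality.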
