Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time and, once submitted, cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on the portions of the article that reflect the reviewer's specialty or area of interest and would place less of a burden on any one reviewer. Providing reviewers with the materials and methods needed for comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers would make it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers, as well as of the submitted article. Finally, we recommend expediting post-publication reviews and allowing the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers.
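To make the quantification concrete, the following Python sketch shows one minimal, hypothetical way that attributed reviews, community ratings of review quality, and reviewer reputation could be linked. It is an illustration only: the names ("Review", "quality_votes"), the 0-1 rating scale, and the weighting scheme are assumptions for the example, not part of the proposed system's specification.

```python
# Minimal, hypothetical sketch: distributed, attributed reviews with
# community quality ratings, aggregated into article- and reviewer-level
# metrics. Names and weighting are illustrative assumptions only.

from dataclasses import dataclass, field
from statistics import mean


@dataclass
class Review:
    reviewer: str                  # identified, acknowledged reviewer
    article_id: str
    section: str                   # the portion of the article this reviewer focused on
    quality_votes: list[float] = field(default_factory=list)  # community ratings in [0, 1]

    def quality(self) -> float:
        """Average community rating of this review (0 if not yet rated)."""
        return mean(self.quality_votes) if self.quality_votes else 0.0


def article_significance(reviews: list[Review]) -> float:
    """Significance proxy: quality-weighted volume of review activity on an article."""
    return sum(r.quality() for r in reviews)


def reviewer_reputation(reviewer: str, all_reviews: list[Review]) -> float:
    """Reputation proxy: mean quality of the reviewer's reviews,
    scaled by the number of distinct articles they have reviewed."""
    own = [r for r in all_reviews if r.reviewer == reviewer]
    if not own:
        return 0.0
    breadth = len({r.article_id for r in own})
    return mean(r.quality() for r in own) * breadth


# Example: two reviewers each evaluate the portion of an article matching
# their specialty; post-publication ratings can keep arriving over time.
reviews = [
    Review("alice", "art-1", "methods", quality_votes=[0.9, 0.8]),
    Review("bob", "art-1", "statistics", quality_votes=[0.6]),
    Review("alice", "art-2", "methods", quality_votes=[0.7]),
]
print(article_significance([r for r in reviews if r.article_id == "art-1"]))  # 1.45
print(reviewer_reputation("alice", reviews))                                   # 1.55
```

The point of the sketch is only that once reviews are attributed and rated, both article-level and reviewer-level metrics fall out of simple aggregations; a production system would additionally need to normalize across fields and guard against gaming of the ratings.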