CrowdRev: A Platform for Crowd-based Screening of Literature

In this paper and demo we present CrowdRev, a crowd- and crowd+AI-based system that supports the screening phase of literature reviews, achieving the same quality as author classification at a fraction of the cost and near-instantly. CrowdRev makes it easy for authors to leverage the crowd, and ensures that no money is wasted even in the face of difficult papers or criteria: if the system detects that a task is too hard for the crowd, it simply gives up (for that paper, for that criterion, or altogether), without wasting money and without ever compromising on quality.
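The give-up behavior described above can be illustrated with a minimal sketch. This is not CrowdRev's actual algorithm; it is a hypothetical sequential policy, assuming a fixed average worker accuracy, that folds crowd votes for one (paper, criterion) pair into a Bayesian posterior and abandons the item when the vote budget runs out before the posterior clears a confidence threshold:

```python
def screen_paper(votes, accuracy=0.75, threshold=0.95, max_votes=8):
    """Sequentially aggregate crowd votes for one (paper, criterion) pair.

    votes: iterable of booleans (True = worker says the exclusion
    criterion applies). `accuracy`, `threshold`, and `max_votes` are
    illustrative parameters, not CrowdRev's actual settings.
    Returns "exclude", "include", or "give_up".
    """
    p_exclude = 0.5  # uninformative prior on "criterion applies"
    for i, vote in enumerate(votes):
        # Likelihood of this vote under each hypothesis, given the
        # assumed worker accuracy.
        like_if_exclude = accuracy if vote else 1 - accuracy
        like_if_include = 1 - accuracy if vote else accuracy
        # Bayesian update of the posterior.
        p_exclude = (p_exclude * like_if_exclude) / (
            p_exclude * like_if_exclude + (1 - p_exclude) * like_if_include
        )
        if p_exclude >= threshold:
            return "exclude"
        if p_exclude <= 1 - threshold:
            return "include"
        if i + 1 >= max_votes:
            break  # budget exhausted without a confident decision
    return "give_up"  # stop spending money on this paper/criterion
```

With these assumed parameters, three agreeing votes already push the posterior past the threshold, while alternating votes keep it near 0.5 until the budget is exhausted and the item is given up, which captures the "stop wasting money on hard items" idea at the level of a single paper and criterion.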
