Abstract

Background: At a time when research output is expanding exponentially, citizen science, the process of engaging willing volunteers in scientific research activities, has an important role to play in helping to manage the information overload. It also creates a model of contribution that enables anyone with an interest in health to contribute meaningfully and flexibly. Citizen science models have proved highly effective in other domains such as astronomy and ecology.

Methods: Cochrane Crowd (crowd.cochrane.org) is a citizen science platform that offers contributors a range of microtasks designed to help identify and describe health research. The platform enables contributors to dive into tasks that capture and describe health evidence. Brief interactive training modules and agreement algorithms help to ensure accurate collective decision making. Contributors can work online or offline, and they can view their activity and performance in detail. They can choose to work in topic areas of interest to them, such as dementia or diabetes, and as contributors progress, they unlock milestone rewards and new tasks. Cochrane Crowd was launched in May 2016. It now hosts a range of microtasks that help to identify health evidence and then describe it according to a PICO (Population; Intervention; Comparator; Outcome) ontology. The microtasks operate either at citation level, in which a contributor is presented with a title and abstract to classify or annotate, or at full-text level, in which a whole paper or a portion of one is displayed.

Results: To date (March 2019), the Cochrane Crowd community comprises over 12,000 contributors from more than 180 countries. Almost 3 million individual classifications have been made, and around 70,000 reports of randomised trials have been identified for Cochrane's Central Register of Controlled Trials. Performance evaluations of crowd accuracy show a crowd sensitivity of 99.1% and a crowd specificity of 99%. The main motivations for involvement are that people want to help Cochrane and that people want to learn.

Conclusion: This model of contribution is now an established part of Cochrane's effort to manage the deluge of information produced, in a way that offers contributors a chance to get involved, learn, and play a crucial role in evidence production. Our experience has shown that people want to be involved and that, with little or no prior experience, they can do certain tasks to a very high degree of collective accuracy. Effective use of a citizen science approach has enabled Cochrane to better support its expert community through better use of human effort. It has also generated large, high-quality data sets on a scale not achieved before, which have provided training material for machine learning routines. Citizen science is not an easy option, but done well it brings a wealth of advantages to both the citizen and the organisation.
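The abstract does not spell out the agreement algorithm or how the performance evaluation was computed. As an illustration only, the sketch below assumes a simple rule in which a record is accepted once a set number of consecutive matching crowd votes is reached and is otherwise routed to expert resolution, and it computes sensitivity and specificity against reference labels. The threshold, labels, and function names are hypothetical, not Cochrane Crowd's actual implementation.

```python
# Minimal sketch of an agreement rule for crowd classifications and of a
# sensitivity/specificity check. All names and the threshold are assumptions;
# the abstract does not describe the real Cochrane Crowd algorithm.

AGREEMENT_THRESHOLD = 3  # assumed number of consecutive matching votes


def collective_decision(votes, threshold=AGREEMENT_THRESHOLD):
    """Return a label once `threshold` consecutive identical votes are seen,
    otherwise None (the record would go to expert resolution)."""
    streak_label, streak_len = None, 0
    for vote in votes:
        if vote == streak_label:
            streak_len += 1
        else:
            streak_label, streak_len = vote, 1
        if streak_len >= threshold:
            return streak_label
    return None


def sensitivity_specificity(decisions, truth, positive="RCT"):
    """Compare crowd decisions with reference labels for one positive class."""
    tp = sum(d == positive and t == positive for d, t in zip(decisions, truth))
    fn = sum(d != positive and t == positive for d, t in zip(decisions, truth))
    tn = sum(d != positive and t != positive for d, t in zip(decisions, truth))
    fp = sum(d == positive and t != positive for d, t in zip(decisions, truth))
    return tp / (tp + fn), tn / (tn + fp)


# Example: three citation records, each with a stream of crowd votes.
votes_per_record = [
    ["RCT", "RCT", "RCT"],              # unanimous -> accepted as "RCT"
    ["Not RCT", "Not RCT", "Not RCT"],  # unanimous -> accepted as "Not RCT"
    ["RCT", "Not RCT", "RCT"],          # no streak -> None, expert resolution
]
decisions = [collective_decision(v) for v in votes_per_record]
print(decisions)  # ['RCT', 'Not RCT', None]

# Evaluating resolved decisions against (hypothetical) reference labels.
resolved = ["RCT", "Not RCT"]
reference = ["RCT", "Not RCT"]
print(sensitivity_specificity(resolved, reference))  # (1.0, 1.0)
```

In practice, the consecutive-agreement idea trades throughput for accuracy: a higher threshold raises collective accuracy but sends more records to expert screeners, which is one way a crowd can reach the high sensitivity and specificity reported in the Results.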