Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR)

Systematic reviews (SRs) are vital to health care, but they have become complicated and time-consuming because of the rapid expansion of the evidence to be synthesised. Fortunately, many systematic review tasks have the potential to be automated or assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, exchanging data and results. We therefore initiated the International Collaboration for the Automation of Systematic Reviews (ICASR) to bring together all the parts needed to automate systematic review production. The first meeting was held in Vienna in October 2015, where we established a set of principles to enable tools to be developed and integrated into toolkits.

This paper sets out the principles devised at that meeting. They cover the need for greater efficiency in SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility in using and combining components, the need for collaboration and varied skills, a preference for open source, shared code and shared evaluations, and a requirement for replicability through rigorous and open evaluation.

Automation has great potential to improve the speed of systematic reviews, and considerable work is already being done on many of the steps involved in a review. The ‘Vienna Principles’ set out in this paper aim to guide a more coordinated effort that will allow the integration of work by separate teams and build on the experience, code and evaluations of the many teams working across the globe.
