A decision support system and rule-based algorithm to augment the human interpretation of the 12-lead electrocardiogram.

BACKGROUND The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, because 12-lead ECG interpretation is complex, it imposes a significant cognitive workload on the interpreter, and this complexity often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation.

OBJECTIVES To improve interpretation accuracy and reduce missed co-abnormalities.

METHODS The Differential Diagnoses Algorithm (DDA) was developed using web technologies. Diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), and queried by a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using both the conventional approach and the IPI+DDA approach.

RESULTS A total of 375 interpretations were collected. The IPI+DDA approach improved diagnostic accuracy by 8.7% (not statistically significant, p = 0.1852), and the DDA suggested the correct interpretation more often than the human interpreter in 7 of 10 cases (with varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated.

CONCLUSION Although the results were not statistically significant, we found that: 1) our decision support tool increased the number of correct interpretations; 2) the DDA algorithm suggested the correct interpretation more often than humans; and 3) as many as seven computerised diagnostic suggestions augmented human decision making in ECG interpretation. Statistical significance may be achieved by expanding the sample size.
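
The METHODS describe diagnostic ECG criteria stored as JSON and queried by a rule-based reasoner that maps an interpreter's annotations to suggested diagnoses. The following is a minimal TypeScript sketch of how such a lookup could work; the rule structure, field names, and example criteria are illustrative assumptions, not the authors' published DDA rule set.

```typescript
// Hypothetical sketch: JSON-defined ECG diagnostic criteria queried by a
// simple rule-based matcher. Field names and criteria are illustrative
// assumptions, not the actual DDA implementation.

interface DiagnosticRule {
  diagnosis: string;             // suggested interpretation
  requiredAnnotations: string[]; // annotations that must all be present
}

// Example criteria, stored as plain JSON in an open format.
const rules: DiagnosticRule[] = [
  {
    diagnosis: "Anterior ST-elevation myocardial infarction",
    requiredAnnotations: ["ST elevation V1-V4"],
  },
  {
    diagnosis: "First-degree AV block",
    requiredAnnotations: ["PR interval > 200 ms"],
  },
  {
    diagnosis: "Atrial fibrillation",
    requiredAnnotations: ["Irregularly irregular rhythm", "Absent P waves"],
  },
];

// Rule-based reasoning: a diagnosis is suggested when every one of its
// required annotations appears among the interpreter's annotations.
function suggestDiagnoses(
  annotations: string[],
  ruleSet: DiagnosticRule[]
): string[] {
  const seen = new Set(annotations);
  return ruleSet
    .filter((rule) => rule.requiredAnnotations.every((a) => seen.has(a)))
    .map((rule) => rule.diagnosis);
}

// Usage: the interpreter's annotations drive the suggestions.
console.log(
  suggestDiagnoses(["Irregularly irregular rhythm", "Absent P waves"], rules)
);
// -> ["Atrial fibrillation"]
```

In this sketch, several rules can match the same annotation set, which is consistent with the abstract's point that the system may present multiple (up to seven) diagnostic suggestions to the interpreter rather than a single answer.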
