CrowdLabel: A crowdsourcing platform for electrophysiology

In foetal electrocardiographic monitoring, the use of the foetal QT interval (FQT) to identify foetal hypoxia has been limited mainly by the lack of public databases with expert labels. We present CrowdLabel, a web-based, open-source annotation platform developed for crowdsourcing medical labels from multiple expert and/or non-expert annotators. We describe the platform and an example use case: improving FQT estimation by creating reference labels against which automated algorithms can be benchmarked. A total of 501 30 s segments were extracted from 15 foetal ECG (FECG) recordings in a private database. Twenty-three volunteers participated in the study, providing a total of 7,307 FQT annotations, which were aggregated using a probabilistic label aggregator (PLA). The best annotator identified by the PLA had a standard deviation in the change of FQT annotations of 13.35 ms and 35.52 ms when labelling FECG with `very good' and `poor' signal quality, respectively. The PLA requires no ground truth to identify the best annotator or annotations. Annotator accuracy was also shown to be a function of objective signal quality measures. These results demonstrate the feasibility of the CrowdLabel annotation system for crowdsourcing ECG annotations when the ground truth is unknown, and report the first experiment conducted using such a platform.
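The abstract does not specify the internals of the PLA, so the following is only a minimal sketch of one common approach to aggregating continuous annotations without ground truth: an EM-style scheme that alternates between precision-weighting each annotator's labels and re-estimating each annotator's noise variance from their residuals. The function name `aggregate_continuous_labels` and all array shapes are hypothetical, not from the paper.

```python
import numpy as np

def aggregate_continuous_labels(y, n_iter=50, eps=1e-6):
    """EM-style aggregation of continuous annotations (e.g. FQT in ms).

    y : (n_items, n_annotators) array; np.nan marks missing annotations.
    Assumes every item has at least one annotation.
    Returns (per-item estimates, per-annotator noise variances).
    """
    mask = ~np.isnan(y)
    # Initialise each annotator's noise variance from the overall spread.
    var = np.full(y.shape[1], np.nanvar(y))

    for _ in range(n_iter):
        # E-step: precision-weighted estimate of each item's true value;
        # annotators with lower estimated variance get higher weight.
        w = mask / (var + eps)
        est = (np.where(mask, y, 0.0) * w).sum(axis=1) / w.sum(axis=1)
        # M-step: update each annotator's variance from their residuals
        # against the current consensus estimates.
        resid = np.where(mask, y - est[:, None], 0.0)
        var = (resid ** 2).sum(axis=0) / np.maximum(mask.sum(axis=0), 1)
    return est, var

# Hypothetical usage: the annotator with the smallest estimated noise
# variance is the "best annotator" in the sense used above.
# est, var = aggregate_continuous_labels(annotations)
# best_annotator = int(np.argmin(var))
```

Under this kind of scheme, no reference labels are needed: annotators who agree closely with the precision-weighted consensus receive low estimated variance, which is consistent with the abstract's claim that the PLA identifies the best annotator without ground truth.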