Benchmarking biomedical text mining web servers at BioCreative V.5: the Technical Interoperability and Performance of annotation Servers (TIPS) track

The TIPS track was a novel experimental task under the umbrella of the BioCreative text mining challenges, whose aim was to carry out, for the first time, a text mining challenge focused on the continuous assessment of technical aspects of text annotation web servers, specifically of biomedical online named entity recognition systems. A total of 13 teams registered annotation servers, implemented in various programming languages and supporting up to 12 different general annotation types. The continuous evaluation period ran from February to March 2017. The systematic and continuous evaluation of server responses covered testing periods of low activity and of moderate to high activity. Moreover, three document provider settings were covered, including NCBI PubMed. Across a total of 4,092,502 requests, the median response time for most servers was below 3.74 s, with a median of 10 annotations per document. Most of the servers showed great reliability and stability, being able to process 100,000 requests in 5 days.
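The continuous evaluation described above amounts to timing each annotation request and aggregating latency and annotation-count statistics. A minimal sketch of such a measurement loop is shown below; the stub annotator and all names here are hypothetical stand-ins for a real NER web server, not the TIPS evaluation harness itself:

```python
import statistics
import time

def benchmark(annotate, documents):
    """Time each annotation call and return median latency and
    median number of annotations per document."""
    latencies = []
    annotation_counts = []
    for doc in documents:
        start = time.perf_counter()
        annotations = annotate(doc)
        latencies.append(time.perf_counter() - start)
        annotation_counts.append(len(annotations))
    return {
        "median_latency_s": statistics.median(latencies),
        "median_annotations": statistics.median(annotation_counts),
    }

# Hypothetical stub standing in for an online NER server:
# tags every all-uppercase token as a gene mention.
def stub_annotate(doc):
    return [{"type": "GENE", "text": w} for w in doc.split() if w.isupper()]

stats = benchmark(stub_annotate, ["BRCA1 is a gene", "TP53 and EGFR interact"])
```

In the actual track, `annotate` would wrap an HTTP request to a registered annotation server, and the loop would run continuously over documents fetched from the configured providers.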