CrowdTerrier: Automatic Crowdsourced Relevance Assessments with Terrier
In this demo, we present CrowdTerrier, an infrastructure extension to the open source Terrier IR platform that enables the semi-automatic generation of relevance assessments for a variety of document ranking tasks using crowdsourcing. The aim of CrowdTerrier is to reduce the time and expertise required to effectively crowdsource relevance assessments by abstracting away the complexities of the crowdsourcing process. It achieves this by automating the assessment process as much as possible, through a close integration of the IR system that ranks the documents (Terrier) and the crowdsourcing marketplace used to assess those documents (Amazon's Mechanical Turk).
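The paper contains no code, so the following Python sketch is only an illustration of the workflow the abstract describes, not CrowdTerrier's actual implementation: given a ranked list of documents (standing in here for Terrier's output), each query-document pair is published as an assessment task (HIT) on Amazon Mechanical Turk via the boto3 SDK. The function name `publish_assessment_hits`, the reward amount, and all other task parameters are hypothetical.

```python
# Illustrative sketch only: publish relevance-assessment HITs on Amazon
# Mechanical Turk for a ranked list of documents. This is NOT CrowdTerrier's
# code; Terrier's ranking output is stubbed as a plain list of tuples.
import boto3

# Sandbox endpoint so test runs do not spend real money.
MTURK_SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

# Minimal HTMLQuestion template; the submission wiring (assignmentId
# handling) is simplified for brevity.
HTML_QUESTION = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html><body>
      <form name="mturk_form" method="post" action="https://workersandbox.mturk.com/mturk/externalSubmit">
        <input type="hidden" name="assignmentId" value="">
        <p><b>Query:</b> {query}</p>
        <p><b>Document:</b> {snippet}</p>
        <p>Is this document relevant to the query?</p>
        <input type="radio" name="relevant" value="yes"> Yes
        <input type="radio" name="relevant" value="no"> No
        <p><input type="submit"></p>
      </form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""


def publish_assessment_hits(query, ranked_docs, max_assignments=3):
    """Create one MTurk HIT per (query, document) pair.

    `ranked_docs` stands in for the ranked list an IR system such as
    Terrier would produce: an iterable of (doc_id, snippet) tuples.
    Returns (doc_id, HITId) pairs for later retrieval of judgements.
    """
    mturk = boto3.client("mturk", endpoint_url=MTURK_SANDBOX)
    hit_ids = []
    for doc_id, snippet in ranked_docs:
        response = mturk.create_hit(
            Title=f"Judge relevance of a document for: {query}",
            Description="Read the document and decide if it answers the query.",
            Keywords="relevance, assessment, information retrieval",
            Reward="0.05",                    # USD per assignment (illustrative)
            MaxAssignments=max_assignments,   # redundant judgements per document
            LifetimeInSeconds=86400,          # HIT stays available for one day
            AssignmentDurationInSeconds=300,  # five minutes per judgement
            Question=HTML_QUESTION.format(query=query, snippet=snippet),
        )
        hit_ids.append((doc_id, response["HIT"]["HITId"]))
    return hit_ids
```

Collecting redundant judgements per document (`MaxAssignments`) and aggregating them, for example by majority vote, is the usual way such pipelines control for worker noise.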