CrowdHub: Extending crowdsourcing platforms for the controlled evaluation of task designs

We present CrowdHub, a tool for running systematic evaluations of task designs on top of crowdsourcing platforms. The goal is to support the evaluation process by avoiding potential experimental biases that, according to our empirical studies, can amount to a 38% loss in the utility of the collected dataset in uncontrolled settings. Using CrowdHub, researchers can map their experimental design and automate the complex process of managing task execution over time while controlling for returning workers and crowd demographics, thus reducing bias, increasing the utility of the collected data, and making more efficient use of a limited pool of subjects.
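To make the returning-worker control concrete, the snippet below is a minimal illustrative sketch in Python, not CrowdHub's actual API: a hypothetical ConditionAssigner pins each worker to a single experimental condition on first contact, so a worker who returns later cannot contribute data to other conditions, while group sizes stay balanced.

```python
from collections import defaultdict

class ConditionAssigner:
    """Pin each worker to at most one experimental condition so that
    returning workers never contribute data to multiple conditions."""

    def __init__(self, conditions):
        self.conditions = list(conditions)
        self.assignment = {}            # worker_id -> condition
        self.counts = defaultdict(int)  # condition -> number of workers

    def assign(self, worker_id):
        # Returning worker: route them back to their original condition
        # instead of letting them re-enter the subject pool.
        if worker_id in self.assignment:
            return self.assignment[worker_id]
        # New worker: place them in the least-filled condition to keep
        # group sizes balanced across the limited pool of subjects.
        condition = min(self.conditions, key=lambda c: self.counts[c])
        self.assignment[worker_id] = condition
        self.counts[condition] += 1
        return condition

# Example: two hypothetical task-design variants under evaluation.
assigner = ConditionAssigner(["highlight", "no_highlight"])
print(assigner.assign("worker_42"))  # first visit: balanced placement
print(assigner.assign("worker_42"))  # return visit: same condition
```

In an actual deployment, such assignments would need to be persisted across task batches so that a worker returning days later is still routed (or excluded) consistently.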
