ReTool: Interactive Microtask and Workflow Design through Demonstration

Beyond simple form filling, there is a growing need in microtask crowdsourcing for workers to perform freeform interactions directly on content (e.g., proofreading articles or specifying object boundaries in an image). Such microtasks are often organized into well-designed workflows to optimize task quality and workload distribution. However, designing and implementing the interfaces and workflows for such microtasks is challenging, as it typically requires programming knowledge and tedious manual effort. We present ReTool, a web-based tool that enables requesters to design and publish interactive microtasks and workflows by demonstrating the microtasks on text and image content. We evaluated ReTool against a task-design tool from a popular crowdsourcing platform and showed its advantages over the existing approach.
