Fair Work: Crowd Work Minimum Wage with One Line of Code

Accurate task pricing in microtask marketplaces requires substantial trial-and-error effort, contributing to a pattern of worker underpayment. In response, we introduce Fair Work, which enables requesters to automatically pay their workers minimum wage by adding a one-line script tag to their task HTML on Amazon Mechanical Turk. Fair Work automatically surveys workers to find out how long the task takes, then aggregates those self-reports and automatically issues bonuses to bring workers up to minimum wage if needed. Evaluations demonstrate that the system estimates payments more accurately than requesters do, and that workers' self-reported times closely match behaviorally observed time measurements. With this work, we aim to lower the threshold for pro-social work practices in microtask marketplaces.
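
To make the mechanism concrete, below is a minimal sketch (not the Fair Work implementation itself) of the auto-bonusing step the abstract describes: aggregate workers' self-reported task durations, convert the aggregate into the pay implied by a target hourly wage, and top each worker up if the HIT's base pay falls short. The median aggregation rule, the constants, and the function names are illustrative assumptions; the commented send_bonus call reflects boto3's MTurk client.

```python
# Minimal sketch of a minimum-wage top-up calculation for an MTurk HIT.
# The aggregation rule (median), constants, and names are assumptions,
# not the actual Fair Work implementation.
from statistics import median

MINIMUM_WAGE_PER_HOUR = 15.00  # target hourly rate (assumption)
BASE_PAY_PER_TASK = 0.25       # what the HIT pays before any bonus

def required_pay(reported_seconds):
    """Pay implied by the aggregated (median) self-reported task time."""
    task_hours = median(reported_seconds) / 3600
    return MINIMUM_WAGE_PER_HOUR * task_hours

def bonus_owed(reported_seconds, base_pay=BASE_PAY_PER_TASK):
    """Top-up needed to reach minimum wage; zero if base pay already suffices."""
    return max(0.0, required_pay(reported_seconds) - base_pay)

# Example: five workers self-report 90-150 seconds per task.
reports = [95, 120, 150, 110, 90]
amount = bonus_owed(reports)
print(f"Bonus per assignment: ${amount:.2f}")

# Paying the bonus would go through the MTurk API, e.g. with boto3:
# mturk = boto3.client("mturk")
# mturk.send_bonus(WorkerId=worker_id, AssignmentId=assignment_id,
#                  BonusAmount=f"{amount:.2f}",
#                  Reason="Top-up to reach the target hourly wage")
```

In the deployed system, as the abstract describes, the requester-facing integration is only the one-line script tag in the task HTML; the time survey, aggregation, and bonusing sketched above happen on the Fair Work side.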
