Becoming the Super Turker: Increasing Wages via a Strategy from High Earning Workers

Crowd markets have traditionally limited workers by not providing transparency information concerning which tasks pay fairly or which requesters are unreliable. Researchers believe that this lack of transparency is a key reason why crowd workers earn low wages. As a result, tools have been developed to provide more transparency within crowd markets to help workers. However, while most workers use these tools, they still earn less than minimum wage. We argue that the missing element is guidance on how to use transparency information. In this paper, we explore how novice workers can improve their earnings by following the transparency criteria of Super Turkers, i.e., crowd workers who earn higher salaries on Amazon Mechanical Turk (MTurk). We believe that Super Turkers have developed effective processes for using transparency information. Therefore, by having novices follow a set of Super Turker criteria (one that is simple and popular among Super Turkers), we can help novices increase their wages. For this purpose, we: (i) conducted a survey and data analysis to computationally identify a simple yet common set of criteria that Super Turkers use when handling transparency tools; and (ii) ran a two-week field experiment in which novices followed these Super Turker criteria to find better work on MTurk. Novices in our study viewed over 25,000 tasks posted by 1,394 requesters. We found that novices who followed these Super Turker criteria earned better wages than other novices. Our results highlight that tool development to support crowd workers should be paired with educational opportunities that teach workers how to effectively use the tools and their related metrics (e.g., transparency values). We conclude with design recommendations for empowering crowd workers to earn higher salaries.
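
To make the idea of following transparency criteria concrete, the following is a minimal Python sketch of how available task metadata could be filtered against such criteria. The HIT fields, thresholds, and the meets_transparency_criteria helper are hypothetical illustrations, not the specific criteria identified in the paper.

# Hypothetical illustration: filter available HITs using transparency
# metadata such as an estimated hourly wage and a community requester rating.
from dataclasses import dataclass

@dataclass
class HIT:
    title: str
    est_hourly_wage: float    # e.g., predicted by a wage-estimation tool
    requester_rating: float   # e.g., a 0-5 community rating of the requester

def meets_transparency_criteria(hit: HIT,
                                min_wage: float = 7.25,
                                min_rating: float = 3.5) -> bool:
    """Return True if the HIT passes the (hypothetical) thresholds."""
    return hit.est_hourly_wage >= min_wage and hit.requester_rating >= min_rating

available = [
    HIT("Image labeling", est_hourly_wage=9.10, requester_rating=4.2),
    HIT("Short survey", est_hourly_wage=3.40, requester_rating=4.8),
]
recommended = [h.title for h in available if meets_transparency_criteria(h)]
print(recommended)  # -> ['Image labeling']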
