A Required Work Payment Scheme for Crowdsourced Disaster Response: Worker Performance and Motivations

Crowdsourcing is an increasingly popular approach for processing data in response to disasters. While volunteer crowdsourcing may suffice for high-profile disasters, paid crowdsourcing may be necessary to recruit workers for less prominent events. Understanding how payment schemes affect worker behavior and motivation may therefore improve outcomes. In this work, we presented workers recruited from Amazon Mechanical Turk with a disaster response task in which they could provide a variable number of image ratings. We paid workers a fixed amount to provide a minimum number of ratings and allowed them to voluntarily provide more; by varying the required minimum, we examined the impact of different amounts of required work. We found that when no ratings were required, workers voluntarily completed more work, and were more likely to report interest-related motivation on a post-task survey, than when a small number of ratings was required. This result is consistent with the motivational crowding-out effect, even in paid crowdsourcing. We additionally found that providing feedback on progress increased the amount of work completed.
