Does Flight Path Context Matter? Impact on Worker Performance in Crowdsourced Aerial Imagery Analysis

Natural disasters cause billions of dollars in damage annually and leave communities struggling with the difficult tasks of response and recovery. To aid these efforts, small private aircraft and drones have been deployed to gather images along flight paths over affected areas, so that the aerial photography can be analyzed through crowdsourcing. However, due to the volume of raw data, the context and ordering of these images are often lost by the time they reach workers. In this work, we explored the effect of contextualizing a labeling task on Amazon Mechanical Turk by serving workers images in the order they were collected during flight and by showing them the location of the current image on a map. We found no negative impact from the loss of flight-order context, while showing map context actually reduced worker performance. This suggests that ordering images by other criteria may be more effective.