Spatial Crowdsourcing Task Assignment Based on the Quality of Workers

With the rapid development of the mobile Internet, a variety of spatial crowdsourcing platforms have emerged and been widely adopted. Task assignment is the core problem in spatial crowdsourcing. Existing assignment methods aim to assign as many tasks to workers as possible, without guaranteeing the quality of the submitted answers. In this paper, two task assignment strategies based on worker quality are proposed to keep the accuracy of the answers submitted by workers as high as possible. A classical quality control algorithm, Incremental Quality Inference, is used to estimate worker quality. The capable worker strategy and the maximum worker distance-quality strategy are proposed and compared with the nearest worker strategy, assigning tasks according to the worker quality computed by Incremental Quality Inference. Experimental results on discount data from an offline shopping mall, collected via a crowdsourcing platform, demonstrate the effectiveness of our approach.
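The three strategies compared in the abstract can be illustrated with a minimal sketch. The worker fields (`x`, `y`, `quality`, `available`), the quality threshold `q_min`, and the weight `alpha` blending quality against normalized distance are all assumptions for illustration; the paper's actual scoring of the distance-quality trade-off may differ.

```python
import math

def assign(task, workers, strategy="distance_quality", q_min=0.8, alpha=0.5):
    """Assign one spatial task to a worker under one of three strategies.

    Hypothetical data model: each worker is a dict with a 2-D location
    (x, y), an estimated answer quality in [0, 1] (e.g. from Incremental
    Quality Inference), and an availability flag; the task has a location.
    """
    def dist(w):
        return math.hypot(w["x"] - task["x"], w["y"] - task["y"])

    candidates = [w for w in workers if w["available"]]
    if not candidates:
        return None

    if strategy == "nearest":
        # Baseline: ignore quality, pick the closest available worker.
        return min(candidates, key=dist)

    if strategy == "capable":
        # Capable worker: restrict to workers above a quality threshold,
        # then pick the nearest among them.
        capable = [w for w in candidates if w["quality"] >= q_min]
        return min(capable, key=dist) if capable else None

    # Distance-quality: maximize a weighted score that rewards high
    # quality and small (normalized) travel distance.
    d_max = max(dist(w) for w in candidates) or 1.0
    def score(w):
        return alpha * w["quality"] + (1 - alpha) * (1 - dist(w) / d_max)
    return max(candidates, key=score)
```

For example, with a nearby low-quality worker and a distant high-quality one, the nearest strategy picks the former, the capable strategy picks the latter, and the distance-quality strategy balances the two according to `alpha`.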
