A Misreport- and Collusion-Proof Crowdsourcing Mechanism Without Quality Verification

Quality control plays a critical role in crowdsourcing. Existing state-of-the-art approaches do not scale to large crowdsourcing applications, because they require the requester to verify task quality or screen professional workers one by one. In this paper, we propose a misreport- and collusion-proof crowdsourcing mechanism that guides workers to truthfully report the quality of their submitted tasks without colluding, so that workers behave as the requester intends. Specifically, the proposed mechanism leaves workers no room to profit through quality misreporting or collusion, and thus quality can be controlled without any verification. Extensive simulation results verify the effectiveness of the proposed mechanism. Finally, the importance and originality of our work lie in revealing several interesting and even counterintuitive findings: 1) a high-quality worker may pretend to be a low-quality one; 2) higher task quality from high-quality workers may not increase the requester's utility; and 3) the requester's utility may not improve as the number of workers grows. These findings can inform forward-looking and strategic planning in crowdsourcing.
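The intuition behind a misreport-proof payment rule, and behind finding (1), can be sketched with a toy model. This is not the paper's actual mechanism; the quality levels, cost function, and payment rules below are our own hypothetical choices, used only to show that a worker's optimal report depends on how the payment margin varies with quality.

```python
# Toy illustration (hypothetical numbers, not the paper's mechanism):
# a worker of type t can produce any discrete quality level q <= t at
# convex cost, and rationally produces/reports whichever feasible level
# maximizes payment minus cost.

QUALITIES = [1, 2, 3]  # discrete quality levels (assumption)

def cost(q):
    """Hypothetical production cost, convex in quality."""
    return q ** 2

def best_quality(payment, worker_type):
    """Quality a rational worker of the given type produces and reports:
    the feasible level (<= its type) maximizing payment minus cost."""
    return max(range(1, worker_type + 1), key=lambda q: payment(q) - cost(q))

# Rule A: payment covers cost plus a margin that RISES with quality.
pay_rising = lambda q: q ** 2 + q        # net utility = q
# Rule B: payment covers cost plus a margin that SHRINKS with quality.
pay_falling = lambda q: q ** 2 + (4 - q)  # net utility = 4 - q

for t in QUALITIES:
    print(t, best_quality(pay_rising, t), best_quality(pay_falling, t))
```

Under rule A every worker type finds it optimal to produce its best feasible quality, so truthful high-quality work is self-enforcing; under rule B even a type-3 worker strictly prefers to deliver quality 1, mirroring the finding that a high-quality worker may pretend to be a low-quality one.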
