Pricing mechanisms for crowdsourcing markets

Every day, millions of crowdsourcing tasks are performed in exchange for payments. Despite the important role pricing plays in crowdsourcing campaigns and the complexity of the market, most platforms do not provide requesters with appropriate tools for effective pricing and allocation of tasks. In this paper, we introduce a framework for designing mechanisms with provable guarantees in crowdsourcing markets. The framework enables automating the process of pricing and allocation of tasks for requesters in complex markets like Amazon's Mechanical Turk, where workers arrive in an online fashion and requesters face budget constraints and task completion deadlines. We present constant-competitive incentive compatible mechanisms for maximizing the number of tasks completed under a budget, and for minimizing payments given a fixed number of tasks to complete. To demonstrate the effectiveness of this framework, we created a platform that enables applying pricing mechanisms in markets like Mechanical Turk. The platform allows us to show that the mechanisms we present here work well in practice, as well as to provide experimental evidence of workers' strategic behavior in the absence of appropriate incentive schemes.
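The budget-constrained setting the abstract describes can be illustrated with a simple posted-price sketch in the spirit of sampling-based online procurement mechanisms. This is not the paper's actual mechanism; the function name, the 50/50 sampling split, and the cost model are illustrative assumptions. Because workers only receive take-it-or-leave-it price offers, they have no incentive to misreport their costs, which is the intuition behind posted-price incentive compatibility.

```python
def posted_price_procurement(arrivals, budget, sample_frac=0.5):
    """Illustrative sketch (not the paper's mechanism): online posted-price
    procurement under a hard budget.

    `arrivals` holds workers' private costs in arrival order; it is used
    only to simulate accept/reject decisions. The mechanism never inspects
    a cost directly: it only posts take-it-or-leave-it prices, so accepting
    iff cost <= price is a dominant strategy for each worker.
    """
    n = len(arrivals)
    sample_size = int(n * sample_frac)

    # Learning phase: observe (but do not hire) an initial sample of
    # workers, and pick the price p that maximizes the number of tasks
    # min(#workers willing at p, budget // p) on that sample.
    sample = arrivals[:sample_size]
    best_price, best_tasks = None, 0
    for p in sorted(set(sample)):
        if p <= 0:
            continue
        willing = sum(1 for c in sample if c <= p)
        tasks = min(willing, int(budget // p))
        if tasks > best_tasks:
            best_price, best_tasks = p, tasks

    # Exploitation phase: post the learned price to the remaining workers
    # until the budget is exhausted.
    completed, spent = 0, 0.0
    for cost in arrivals[sample_size:]:
        if best_price is None or spent + best_price > budget:
            break
        if cost <= best_price:
            completed += 1
            spent += best_price
    return completed, spent
```

For example, with ten workers whose cost is 1 each and a budget of 5, the learning phase settles on a posted price of 1 and the exploitation phase completes 5 tasks. Discarding the sampled workers is what costs such mechanisms a constant factor relative to the offline optimum.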
