Game Theoretic Analysis for Offense-Defense Challenges of Algorithm Contests on TopCoder

Software crowdsourcing platforms such as TopCoder have successfully adopted an offense-defense quality assurance mechanism in the software development process to deliver high-quality software solutions. TopCoder algorithm contests are run as Single Round Matches (SRMs), whose challenge phase allows participants to find bugs in opponents' submitted programs and thereby eliminate those opponents. In this paper, we introduce a game theoretic model to study competitive behavior in the challenge phase of an SRM. By analyzing the Nash equilibrium of our multi-person game model, we find that the probability of making a successful challenge and the effort cost are the major factors in contestants' decisions. To verify the theoretical result, we perform an empirical analysis of a dataset collected from the algorithm challenge phase on TopCoder. The results indicate that contestants with a high rating are more likely to launch challenges against lower-rated opponents; however, the highest-rated contestants may be unwilling to challenge, so as to avoid the risk of losing rating points in the contest.
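To make the decision criterion concrete, the sketch below frames a contestant's choice as an expected-payoff comparison. This is an illustrative reading of the abstract, not the paper's actual model: the +50/-25 values follow standard TopCoder SRM challenge scoring, while the linear payoff form, the point-equivalent effort cost, and the `should_challenge` helper are assumptions introduced here.

```python
def should_challenge(p_success, effort_cost, reward=50.0, penalty=25.0):
    """Expected-payoff rule for launching a challenge in an SRM.

    p_success   -- contestant's estimated probability that the challenge succeeds
    effort_cost -- cost of scrutinizing the opponent's solution,
                   expressed in point-equivalents (an assumption of this sketch)
    reward      -- points gained on a successful challenge (+50 under SRM rules)
    penalty     -- points lost on a failed challenge (-25 under SRM rules)
    """
    expected_gain = p_success * reward - (1.0 - p_success) * penalty
    return expected_gain > effort_cost


# Example: a 50% success estimate yields 0.5*50 - 0.5*25 = 12.5 points,
# so challenging pays off whenever the effort cost is below 12.5.
print(should_challenge(0.5, 5.0))   # True
print(should_challenge(0.2, 5.0))   # False: 0.2*50 - 0.8*25 = -10
```

Under this reading, a higher-rated contestant plausibly assigns a larger `p_success` when inspecting a lower-rated opponent's solution, which is consistent with the empirical pattern the abstract reports; a top-rated contestant facing mostly strong opponents sees a smaller expected gain relative to the downside, matching the observed reluctance to challenge.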
