Knowledge-based quality analysis of crowdsourced software development platforms

As an emerging and promising approach, crowdsourcing-based software development has become popular in many domains because it attracts a talented pool of developers to its contests and enables requesters (or customers) to choose the 'winning' solution that best matches their desired quality level. However, the lack of a central mechanism for team formation, the lack of continuity in a developer's work across consecutive tasks, and the risk of noise in contest submissions create a gap between the requesters of a domain and the quality concerns they have about adopting a crowdsourcing-based software development platform. To address these concerns and aid requesters, we describe three measures: Quality of Registrant Developers (QRD), Quality of Contest (QC), and Quality of Support (QS), which compute and predict the quality of a crowdsourcing-based platform from historical information on its completed tasks. We evaluate the capacity of QRD, QC, and QS to act as predictors of quality. Subsequently, we implement a crawler to mine information on completed development tasks from the TopCoder platform and use it to inspect the proposed measures. The promising results of the QRD, QC, and QS measures suggest that requesters and researchers in other domains, such as pharmaceutical research and development, can use them to investigate and predict the quality of crowdsourcing-based software development platforms.
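
To make the approach concrete, the following minimal Python sketch shows how platform-level measures of this kind could be aggregated from crawled records of completed tasks. The abstract does not give the paper's actual formulas, so the record fields (registrants, submitters, review_score) and the per-measure proxies below are illustrative assumptions, not the authors' definitions.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class CompletedTask:
    """One completed TopCoder development task mined by the crawler (assumed schema)."""
    registrants: int          # developers who registered for the contest
    submitters: int           # registrants who actually submitted a solution
    passing_submissions: int  # submissions that passed screening/review
    total_submissions: int
    review_score: float       # review score of the winning submission (0-100)

# Hypothetical proxies for the three measures; the paper's actual
# definitions are not stated in the abstract, so these are assumptions.
def qrd(tasks):
    """Quality of Registrant Developers: share of registrants who submit."""
    return mean(t.submitters / t.registrants for t in tasks if t.registrants)

def qc(tasks):
    """Quality of Contest: share of submissions that pass review (low noise)."""
    return mean(t.passing_submissions / t.total_submissions
                for t in tasks if t.total_submissions)

def qs(tasks):
    """Quality of Support: normalized mean review score of winning solutions."""
    return mean(t.review_score for t in tasks) / 100.0

if __name__ == "__main__":
    # Toy history standing in for tasks mined from the platform.
    history = [
        CompletedTask(registrants=25, submitters=6, passing_submissions=4,
                      total_submissions=6, review_score=92.3),
        CompletedTask(registrants=18, submitters=3, passing_submissions=2,
                      total_submissions=3, review_score=88.1),
    ]
    print(f"QRD={qrd(history):.2f}  QC={qc(history):.2f}  QS={qs(history):.2f}")
```

Under these assumptions, each measure collapses a platform's completed-task history into a single score in [0, 1], so two platforms (or two time windows of one platform) can be compared directly before a requester commits to posting a task.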
