Strategies for Crowdworkers to Overcome Barriers in Competition-based Software Crowdsourcing Development

Crowdsourcing in software development draws on a large on-demand pool of developers to outsource parts of a software project, or even the entire project, to a crowd. To succeed, this model requires a continuous influx of developers, or crowdworkers. However, crowdworkers face many barriers when attempting to participate in software crowdsourcing, and these barriers often lead to few and low-quality submitted solutions. In our previous work, we identified several barriers faced by crowdworkers, including finding a task that matches their abilities, setting up the environment to perform the task, and managing their personal time. We also proposed six strategies to overcome or minimize these barriers. In this paper, we evaluate these six strategies by questioning Software Crowdsourcing (SW CS) experts. The results show that software crowdsourcing platforms need to: (i) provide a system that helps match task requirements to crowdworkers' profiles; (ii) adopt containers or virtual machines to help crowdworkers set up the environment to perform the task; (iii) support crowdworkers in planning and managing their personal time; and (iv) adopt communication channels that allow crowdworkers to clarify questions about the requirements and, as a consequence, finish their tasks.
