Using Hierarchical Skills for Optimized Task Assignment in Knowledge-Intensive Crowdsourcing

Beyond simple human intelligence tasks such as image labeling, crowdsourcing platforms increasingly offer tasks that require very specific skills, especially in participatory science projects. In this context, there is a need to reason about the skills a task requires and the skills available in the crowd, in order to increase the quality of the results. Most existing solutions rely on unstructured tags to model skills (a vector of skills). In this paper we propose to model tasks and participants finely using a skill tree, that is, a taxonomy of skills equipped with a similarity distance between skills. This skill model makes it possible to map participants to tasks in a way that exploits the natural hierarchy among skills. We illustrate the effectiveness of our model and algorithms through extensive experiments on synthetic and real data sets.
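The skill-tree idea can be sketched as follows. This is a minimal illustration, not the paper's exact construction: the toy taxonomy and the Wu-Palmer-style similarity (based on the depth of the lowest common ancestor) are assumptions chosen for concreteness.

```python
# Illustrative sketch of a skill taxonomy with a hierarchy-aware similarity.
# The taxonomy below and the Wu-Palmer-style measure are assumptions;
# the paper's actual similarity distance may differ.

# Parent map encoding a toy skill tree rooted at "science".
PARENT = {
    "biology": "science",
    "botany": "biology",
    "zoology": "biology",
    "astronomy": "science",
}

def depth(skill):
    """Depth of a skill node; the root has depth 0."""
    d = 0
    while skill in PARENT:
        skill = PARENT[skill]
        d += 1
    return d

def ancestors(skill):
    """The skill itself plus all its ancestors up to the root."""
    out = {skill}
    while skill in PARENT:
        skill = PARENT[skill]
        out.add(skill)
    return out

def lca(a, b):
    """Lowest common ancestor of two skills in the tree."""
    return max(ancestors(a) & ancestors(b), key=depth)

def similarity(a, b):
    """Wu-Palmer-style similarity in [0, 1]: the deeper the shared
    ancestor, the more similar the two skills."""
    total = depth(a) + depth(b)
    return 2.0 * depth(lca(a, b)) / total if total else 1.0
```

Under this sketch, sibling skills such as `botany` and `zoology` (shared ancestor `biology`) score higher than skills whose only common ancestor is the root, which is exactly the hierarchy-aware behavior that flat tag vectors cannot express.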
