An online approach for joint task assignment and worker evaluation in crowd-sourcing
[1] Mark W. Schmidt, et al. Modeling annotator expertise: Learning when everybody knows a bit of something, 2010, AISTATS.
[2] Gerardo Hermosillo, et al. Learning From Crowds, 2010, J. Mach. Learn. Res.
[3] Mausam, et al. Parallel Task Routing for Crowdsourcing, 2014, HCOMP.
[4] Javier R. Movellan, et al. Whose Vote Should Count More: Optimal Integration of Labels from Labelers of Unknown Expertise, 2009, NIPS.
[5] David J. C. MacKay, et al. Information-Based Objective Functions for Active Data Selection, 1992, Neural Computation.
[6] Jian Peng, et al. Variational Inference for Crowdsourcing, 2012, NIPS.
[7] Panagiotis G. Ipeirotis, et al. Get another label? Improving data quality and data mining using multiple, noisy labelers, 2008, KDD.
[8] Jennifer G. Dy, et al. Active Learning from Crowds, 2011, ICML.
[9] Devavrat Shah, et al. Iterative Learning for Reliable Crowdsourcing Systems, 2011, NIPS.
[10] Tom Minka, et al. How To Grade a Test Without Knowing the Answers - A Bayesian Graphical Model for Adaptive Crowdsourcing and Aptitude Testing, 2012, ICML.
[11] Mingyan Liu, et al. An Online Learning Approach to Improving the Quality of Crowd-Sourcing, 2015, SIGMETRICS.
[12] Tom Minka, et al. Expectation Propagation for Approximate Bayesian Inference, 2001, UAI.
[13] Michael I. Jordan, et al. Loopy Belief Propagation for Approximate Inference: An Empirical Study, 1999, UAI.
[14] Ittai Abraham, et al. Adaptive Crowdsourcing Algorithms for the Bandit Survey Problem, 2013, COLT.
[15] A. P. Dawid, et al. Maximum Likelihood Estimation of Observer Error-Rates Using the EM Algorithm, 1979.
[16] Peng Dai, et al. Decision-Theoretic Control of Crowd-Sourced Workflows, 2010, AAAI.
[17] Tom Minka, et al. TrueSkill™: A Bayesian Skill Rating System, 2006, NIPS.
[18] Rong Jin, et al. Learning with Multiple Labels, 2002, NIPS.
[19] John C. Platt, et al. Learning from the Wisdom of Crowds by Minimax Entropy, 2012, NIPS.
[20] Qiang Liu, et al. Scoring Workers in Crowdsourcing: How Many Control Questions are Enough?, 2013, NIPS.