Knapsack problems cover a wide range of real-world challenges [?]. For example, every year a professor has to decide her new “squad” of students/staff from possibly hundreds of candidates, while keeping a restricted funding budget in mind. Moreover, in many cases, she has to turn to her colleagues and senior students to make comparisons among the candidates. The difficulties of such tasks are mainly three-fold: 1) the knowledge about the candidates is distributed among a crowd; 2) the underlying factors are human-intrinsic and hard to formalize; 3) the number of candidates exceeds human capacity for a one-shot decision. Other examples in this category include gear preparation for an adventure trip, syllabus design for a popular course, and inventory design for a goods shelf, where these difficulties are commonly observed. Consequently, a person may struggle to work out a final decision, which may still be inaccurate. Driven by this demand, in this demo, we present C-DMr, a Crowd-powered Decision Maker that incorporates the wisdom of informed crowds to solve such real-world Knapsack Problems. The core module of this web-based system is a set of algorithms along with a novel interactive interface. The interface incrementally presents comparison jobs and motivates the crowd to participate via a rewarding mechanism, and the set of algorithms solves the Knapsack Problem given only pairwise preferences among candidates. We demonstrate the novelty and usefulness of C-DMr by forming the aforementioned “squad” for a recruiting professor. Specifically, four functionalities are shown: 1) a Candidates Entrance that collects the information about all candidates; 2) a Jury Trial that facilitates informed crowds in contributing preferences; 3) a Knapsack Analyzer that measures the on-going “squad”; and 4) a Consultant that recommends a final set of candidates to the professor.
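As a minimal sketch of the underlying idea, the pipeline can be thought of as two steps: aggregate the crowd's pairwise preferences into per-candidate scores, then solve a 0/1 knapsack under the budget. The sketch below assumes a Borda-style win-count aggregation and a standard dynamic program; all names and data are hypothetical, and C-DMr's actual algorithms may differ.

```python
def borda_scores(candidates, preferences):
    """Aggregate crowd pairwise preferences (winner, loser) into
    per-candidate scores by counting wins (a Borda-style heuristic)."""
    scores = {c: 0 for c in candidates}
    for winner, _loser in preferences:
        scores[winner] += 1
    return scores

def knapsack(candidates, costs, scores, budget):
    """Classic 0/1 knapsack DP: maximize total score under the budget."""
    # dp[b] = (best score, chosen set) achievable with total cost <= b
    dp = [(0, [])] * (budget + 1)
    for c in candidates:
        cost, val = costs[c], scores[c]
        # Iterate budgets downward so each candidate is picked at most once.
        for b in range(budget, cost - 1, -1):
            cand_val = dp[b - cost][0] + val
            if cand_val > dp[b][0]:
                dp[b] = (cand_val, dp[b - cost][1] + [c])
    return dp[budget]

# Hypothetical example: four candidates, crowd comparisons, budget of 5.
cands = ["alice", "bob", "carol", "dan"]
prefs = [("alice", "bob"), ("alice", "carol"),
         ("carol", "bob"), ("dan", "bob")]
costs = {"alice": 3, "bob": 1, "carol": 2, "dan": 2}
best_score, squad = knapsack(cands, costs, borda_scores(cands, prefs), budget=5)
```

Here the crowd's comparisons give alice 2 wins and carol and dan 1 each, so the DP selects the highest-scoring affordable subset.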
[1] L. S. Shapley, et al. College Admissions and the Stability of Marriage. Am. Math. Mon., 2013.
[2] David R. Karger, et al. Human-powered Sorts and Joins. Proc. VLDB Endow., 2011.
[3] Lei Chen, et al. CrowdCleaner: Data cleaning for multi-version data on the web via crowdsourcing. 2014 IEEE 30th International Conference on Data Engineering, 2014.
[4] Lei Chen, et al. WiseMarket: a new paradigm for managing wisdom of online social users. KDD, 2013.
[5] Tim Kraska, et al. CrowdDB: answering queries with crowdsourcing. SIGMOD '11, 2011.
[6] Lei Chen, et al. Whom to Ask? Jury Selection for Decision Making Tasks on Micro-blog Services. Proc. VLDB Endow., 2012.
[7] Michael C. Pyryt. Human cognitive abilities: A survey of factor analytic studies. 1998.
[8] P. Fishburn. Signed orders and power set extensions. 1992.
[9] Beng Chin Ooi, et al. CDAS: A Crowdsourcing Data Analytics System. Proc. VLDB Endow., 2012.
[10] Aditya G. Parameswaran, et al. Answering Queries using Humans, Algorithms and Databases. CIDR, 2011.