Today's Web of Data is noisy, and Linked Data often requires extensive preprocessing before heterogeneous resources can be used efficiently. While consistent and valid data is the key to efficient data processing and aggregation, we face two main challenges: (1) identifying erroneous facts and tracking their origins across dynamically connected datasets is difficult, and (2) efforts to curate deficient facts in Linked Data are rarely shared. Since erroneous data is often duplicated and (re-)distributed by mashup applications, keeping data tidy is not the responsibility of a few original publishers alone, but a mission for all distributors and consumers of Linked Data. We present a new approach to exposing and reusing patches on erroneous data in order to enhance the Web of Data and enrich it with quality information. We demonstrate the feasibility of our approach with a collaborative game that patches statements in DBpedia and provides notifications for relevant changes.
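The patch mechanism described above can be illustrated with a minimal sketch. Here a patch is modeled simply as a pair of triple sets to delete and to insert, applied to a dataset represented as a set of (subject, predicate, object) triples; all identifiers and the `apply_patch` helper are illustrative assumptions, not the paper's actual vocabulary or implementation.

```python
# Minimal sketch of applying a patch to RDF-like data.
# A dataset is a set of (subject, predicate, object) triples;
# a patch removes erroneous triples and adds corrected ones.
# All names below are hypothetical, chosen only for illustration.

def apply_patch(triples, delete, insert):
    """Return a new triple set with erroneous facts removed and fixes added."""
    return (triples - delete) | insert

data = {
    ("dbr:Berlin", "dbo:country", "dbr:France"),          # erroneous fact
    ("dbr:Berlin", "dbo:populationTotal", "3500000"),
}
patch_delete = {("dbr:Berlin", "dbo:country", "dbr:France")}
patch_insert = {("dbr:Berlin", "dbo:country", "dbr:Germany")}

patched = apply_patch(data, patch_delete, patch_insert)
```

Exposing patches in this delete/insert form is what allows other distributors and consumers to reuse the same correction instead of re-curating the error independently.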