This presentation examines crowdsourcing as a means of handling Linked Data quality problems that are difficult to solve automatically. The authors analyzed the most common errors encountered in Linked Data sources and classified them according to how amenable each is to a specific form of crowdsourcing. Based on this analysis, they implemented a quality assessment methodology for Linked Data that leverages the wisdom of the crowd in different ways. Also available as PDF: http://videolectures.net/site/normal_dl/tag=817822/iswc2013_acosta_quality_assessment_01.pdf

URL: http://videolectures.net/iswc2013_acosta_quality_assessment/
Keywords: Dataset, RDF, Triple, DBpedia
Author: Acosta, Maribel
Publisher: videolectures.net
Date created: 2013-11-28 05:00:00.000
Language: http://id.loc.gov/vocabulary/iso639-2/eng
Time required: PT15M
Educational use: professionalDevelopment
Educational audience: generalPublic
Interactivity type: expositive