Proceedings of the 30th Conference of Open Innovations Association FRUCT (Oct 2021)

Methods for Aggregating Crowdsourced Ontology-based Item Annotations

  • Andrew Ponomarev

DOI: https://doi.org/10.23919/FRUCT53335.2021.9599979
Journal volume & issue: Vol. 30, no. 1, pp. 177–183

Abstract

Crowdsourcing plays an important role in the modern IT landscape, enabling the use of human information-processing abilities to solve problems that are still hard for machines. One of its specific (and most in-demand) applications is collecting item annotations, i.e., describing the contents of complex items with the help of labels (tags). Because input received from crowdsourcing participants is typically unreliable, each item is usually processed by several participants, and the resulting annotations have to be aggregated to improve their quality. The paper considers a special case of annotation in which the set of possible labels, as well as the set of relationships between the labeled items and the labels, is defined by an OWL 2 ontology (the OWL 2 QL profile). Such semantic item annotations turn out to be very useful for organizing large collections of items and for enabling semantic search over them. To increase annotation quality, the paper proposes two aggregation methods, OntoVoting and OntoSB, which differ in that the first is agnostic with respect to participant reliability, whereas the second accounts for variations in reliability. Simulation experiments with ontology-based annotations of varying quality show that the proposed aggregation methods increase the quality of collected ontology-based item annotations.
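
The abstract does not spell out the two methods, but the contrast it draws (reliability-agnostic vs. reliability-aware aggregation) can be illustrated with a generic voting sketch. The function, worker weights, and example labels below are hypothetical illustrations of that general idea, not the paper's OntoVoting or OntoSB algorithms.

```python
from collections import defaultdict

def aggregate_annotations(annotations, reliability=None):
    """Aggregate crowdsourced (item, label) annotations by (weighted) voting.

    annotations: list of (worker_id, item_id, label) triples.
    reliability: optional dict worker_id -> weight in (0, 1]; if omitted,
                 every worker gets weight 1 (plain, reliability-agnostic voting).
    Returns: dict item_id -> label with the highest total vote weight.
    """
    votes = defaultdict(lambda: defaultdict(float))
    for worker, item, label in annotations:
        weight = 1.0 if reliability is None else reliability.get(worker, 0.5)
        votes[item][label] += weight
    return {item: max(counts, key=counts.get) for item, counts in votes.items()}

if __name__ == "__main__":
    # Hypothetical annotations of one item by three participants.
    raw = [
        ("w1", "photo42", "Cathedral"),
        ("w2", "photo42", "Cathedral"),
        ("w3", "photo42", "Bridge"),
    ]
    print(aggregate_annotations(raw))                                      # reliability-agnostic
    print(aggregate_annotations(raw, {"w1": 0.9, "w2": 0.6, "w3": 0.95}))  # reliability-weighted
```

In the ontology-based setting described in the paper, the candidate labels would additionally be constrained by the OWL 2 QL ontology (admissible classes and relationships), which this generic sketch does not model.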

Keywords