PLoS ONE (Jan 2019)

Producing knowledge by admitting ignorance: Enhancing data quality through an "I don't know" option in citizen science.

  • Marina Torre,
  • Shinnosuke Nakayama,
  • Tyrone J Tolbert,
  • Maurizio Porfiri

DOI: https://doi.org/10.1371/journal.pone.0211907
Journal volume & issue: Vol. 14, no. 2, p. e0211907

Abstract


The "noisy labeler problem" in crowdsourced data has attracted great attention in recent years, with important ramifications for citizen science, where non-experts must produce high-quality data. Particularly relevant to citizen science is dynamic task allocation, in which the level of agreement among labelers is progressively updated through the information-theoretic notion of entropy. Under dynamic task allocation, we hypothesized that providing volunteers with an "I don't know" option would enhance data quality by introducing further useful information about the level of agreement among volunteers. We investigated the influence of an "I don't know" option on data quality in a citizen science project that entailed classifying images of a highly polluted canal as either "threat" or "no threat" to the environment. Our results show that an "I don't know" option can enhance accuracy compared to the case without it; the improvement mostly affects the true negative rate rather than the true positive rate. In an information-theoretic sense, these seemingly meaningless blank votes constitute a meaningful piece of information that helps enhance the accuracy of data in citizen science.
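The entropy-based agreement measure mentioned in the abstract can be sketched as follows. This is an illustrative assumption of how such a measure might be computed, not the authors' implementation: the function name `vote_entropy` and the treatment of "I don't know" as a third label category are hypothetical.

```python
from collections import Counter
from math import log2

def vote_entropy(votes):
    """Shannon entropy (in bits) of the empirical vote distribution.

    Each distinct label, including an "I don't know" abstention,
    counts as its own category, so blank votes still contribute
    information about the level of agreement among volunteers.
    """
    counts = Counter(votes)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Unanimous votes carry no uncertainty about the label.
print(vote_entropy(["threat"] * 5))  # 0.0 bits

# Abstentions raise the measured disagreement, which a dynamic
# task allocator could use to decide whether to request more votes.
print(vote_entropy(["threat", "threat", "no threat", "unsure", "unsure"]))
```

Under dynamic task allocation, an image with high vote entropy would keep being assigned to new volunteers, while a low-entropy (near-unanimous) image could be retired early.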