Assignment strategies for ground truths in the crowdsourcing of labeling tasks



Bibliographic Details
Published in: The Journal of Systems and Software, Vol. 126, pp. 113–126
Main Authors: Kubota, Takuya; Aritsugi, Masayoshi
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.04.2017

Summary:
• Algorithms for exploiting ground truths in crowdsourcing are developed.
• A quite general model of workers is assumed in the development.
• The algorithms can benefit general EM algorithm-based approaches.
• Evaluation demonstrates that the algorithms work well in various situations.

It is expected that ground truths can yield many good labels in the crowdsourcing of labeling tasks. However, the use of ground truths has so far not been adequately addressed. In this paper, we develop algorithms that determine how many ground truths are necessary. We determine this number by iteratively calculating the expected quality of labels for tasks with various sets of ground truths, and then comparing that quality with the limit of the estimated label quality expected to be obtained by crowdsourcing. We assume that each worker has a different, unknown labeling ability and performs a different number of tasks. Under this assumption, we develop assignment strategies for ground truths based on the estimated confidence intervals of the workers. Our algorithms can use different expectation-maximization-based approaches to estimate good-quality consensus labels. An experimental evaluation demonstrates that our algorithms work well in various situations.
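The abstract names two technical ingredients: expectation-maximization (EM) based estimation of consensus labels and per-worker confidence intervals that guide the assignment of ground-truth (gold) tasks. The paper's exact algorithms are not reproduced in this record, so the following is only a minimal illustrative sketch of those two ideas: a one-coin simplification of Dawid-Skene-style EM (a single accuracy parameter per worker) and a Wilson score interval on worker accuracy. All function names and modeling choices here are assumptions for illustration, not the authors' implementation.

```python
# Sketch (assumed, simplified): EM consensus labeling plus a per-worker
# confidence interval. Binary labels only; one accuracy parameter per worker.
import math
import random

def em_consensus(labels, n_tasks, n_workers, iters=50):
    """labels: list of (task, worker, value) with value in {0, 1}.
    Returns (posterior prob. each task's true label is 1, worker accuracies)."""
    p = [0.5] * n_tasks       # P(true label = 1) per task
    acc = [0.8] * n_workers   # worker accuracy under a one-coin model
    for _ in range(iters):
        # E-step: posterior over each task's true label given worker accuracies
        for t in range(n_tasks):
            l1, l0 = 1.0, 1.0
            for (task, w, v) in labels:
                if task != t:
                    continue
                l1 *= acc[w] if v == 1 else 1 - acc[w]
                l0 *= acc[w] if v == 0 else 1 - acc[w]
            p[t] = l1 / (l1 + l0)
        # M-step: re-estimate each worker's accuracy from the posteriors
        for w in range(n_workers):
            num, den = 0.0, 0.0
            for (task, worker, v) in labels:
                if worker != w:
                    continue
                num += p[task] if v == 1 else 1 - p[task]
                den += 1.0
            if den > 0:
                # Clamp away from 0/1 to keep the E-step numerically stable
                acc[w] = min(max(num / den, 1e-3), 1 - 1e-3)
    return p, acc

def wilson_interval(correct, n, z=1.96):
    """95% Wilson score interval for a worker's accuracy on n gold tasks."""
    if n == 0:
        return (0.0, 1.0)  # no evidence yet: widest possible interval
    phat = correct / n
    denom = 1 + z * z / n
    centre = (phat + z * z / (2 * n)) / denom
    half = z * math.sqrt(phat * (1 - phat) / n + z * z / (4 * n * n)) / denom
    return (centre - half, centre + half)

# Toy run: 4 tasks, 3 workers; worker 2 is noisier than the others.
random.seed(0)
truth = [1, 0, 1, 1]
labels = [(t, w,
           truth[t] if random.random() < (0.9 if w < 2 else 0.6)
           else 1 - truth[t])
          for t in range(4) for w in range(3)]
probs, accs = em_consensus(labels, n_tasks=4, n_workers=3)
print([round(x, 2) for x in probs], [round(a, 2) for a in accs])
print(wilson_interval(correct=7, n=10))  # a wide interval suggests the
                                         # worker needs more gold tasks
```

In a scheme along the lines the abstract describes, a wide interval from wilson_interval would indicate that a worker's ability is still poorly estimated, so more ground-truth tasks could be assigned there before trusting the EM consensus; the paper's actual assignment strategies and stopping criterion are more involved than this sketch.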
ISSN: 0164-1212
EISSN: 1873-1228
DOI: 10.1016/j.jss.2016.06.061