Personalized and Diverse Task Composition in Crowdsourcing

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, Vol. 30, No. 1, pp. 128-141
Main Authors: Alsayasneh, Maha; Amer-Yahia, Sihem; Gaussier, Eric; Leroy, Vincent; Pilourdault, Julien; Borromeo, Ria Mae; Toyama, Motomichi; Renders, Jean-Michel
Format: Journal Article
Language: English
Published: New York: IEEE, 01.01.2018

More Information
Summary: We study task composition in crowdsourcing and the effect of personalization and diversity on performance. A central process in crowdsourcing is task assignment, the mechanism through which workers find tasks. On popular platforms such as Amazon Mechanical Turk, task assignment is facilitated by the ability to sort tasks by dimensions such as creation date or reward amount. Task composition improves task assignment by producing, for each worker, a personalized summary of tasks, referred to as a Composite Task (CT). We propose different ways of producing CTs and formulate an optimization problem that finds, for a worker, the most relevant and diverse CTs. We show empirically that workers' experience is greatly improved by personalization that aligns CTs with workers' skills and preferences. We also study and formalize various ways of diversifying the tasks in each CT. Task diversity is grounded in organization studies that have shown its impact on worker motivation [33]. Our experiments show that diverse CTs help improve outcome quality. More specifically, we show that while task throughput and worker retention are best with ranked lists, crowdwork quality is highest with CTs diversified by requesters, confirming that workers seek to expose their "good" work to many requesters.
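The paper's exact CT optimization is not given in this record; as an illustrative sketch only, a greedy relevance/diversity trade-off for selecting tasks into one worker's CT might look like the following. The task list, relevance scores, and `trade_off` weight are all hypothetical placeholders; requester-based diversity mirrors the "diversified by requesters" setting the abstract highlights.

```python
# Illustrative sketch, NOT the paper's algorithm: greedily select k tasks,
# balancing a worker-specific relevance score against requester diversity.

def compose_ct(tasks, k, trade_off=0.5):
    """Select k task ids for one worker.

    tasks: list of (task_id, relevance, requester) tuples, where relevance
           is a hypothetical worker-specific score in [0, 1].
    trade_off: weight on relevance vs. covering new requesters.
    """
    selected = []
    seen_requesters = set()
    remaining = list(tasks)
    while remaining and len(selected) < k:
        def marginal_gain(t):
            _, relevance, requester = t
            # Reward tasks from requesters not yet represented in the CT.
            diversity = 1.0 if requester not in seen_requesters else 0.0
            return trade_off * relevance + (1 - trade_off) * diversity
        best = max(remaining, key=marginal_gain)
        remaining.remove(best)
        selected.append(best[0])
        seen_requesters.add(best[2])
    return selected

tasks = [
    ("t1", 0.9, "reqA"),
    ("t2", 0.8, "reqA"),
    ("t3", 0.6, "reqB"),
    ("t4", 0.4, "reqC"),
]
print(compose_ct(tasks, 3))  # → ['t1', 't3', 't4']
```

Note how the second-most-relevant task `t2` is skipped: it shares a requester with `t1`, so the diversity term favors `t3` and `t4` instead, which is the intuition behind requester-diversified CTs.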
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2017.2755660