Nonnaïveté among Amazon Mechanical Turk workers: Consequences and solutions for behavioral researchers

Bibliographic Details
Published in: Behavior Research Methods, Vol. 46, No. 1, pp. 112-130
Main Authors: Chandler, Jesse; Mueller, Pam; Paolacci, Gabriele
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.03.2014 (Springer Nature B.V.)
Summary: Crowdsourcing services—particularly Amazon Mechanical Turk—have made it easy for behavioral scientists to recruit research participants. However, researchers have overlooked crucial differences between crowdsourcing and traditional recruitment methods that provide unique opportunities and challenges. We show that crowdsourced workers are likely to participate across multiple related experiments and that researchers are overzealous in the exclusion of research participants. We describe how both of these problems can be avoided using advanced interface features that also allow prescreening and longitudinal data collection. Using these techniques can minimize the effects of previously ignored drawbacks and expand the scope of crowdsourcing as a tool for psychological research.
ISSN: 1554-351X; 1554-3528
DOI: 10.3758/s13428-013-0365-7