Introducing the NEMO-Lowlands iconic gesture dataset, collected through a gameful human–robot interaction

Bibliographic Details
Published in: Behavior Research Methods, Vol. 53, No. 3, pp. 1353-1370
Main Authors: de Wit, Jan; Krahmer, Emiel; Vogt, Paul
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.06.2021

More Information
Summary: This paper describes a novel dataset of iconic gestures, together with a publicly available robot-based elicitation method to record these gestures, which consists of playing a game of charades with a humanoid robot. The game was deployed at a science museum (NEMO) and a large popular music festival (Lowlands) in the Netherlands. This resulted in recordings of 428 participants, both adults and children, performing 3715 silent iconic gestures for 35 different objects in a naturalistic setting. Our dataset adds to existing collections of iconic gesture recordings in two important ways. First, participants were free to choose how they represented the broad concepts using gestures, and they were asked to perform a second attempt if the robot did not recognize their gesture the first time. This provides insight into potential repair strategies that might be used. Second, by making the interactive game available we enable other researchers to collect additional recordings, for different concepts, and in diverse cultures or contexts. This can be done in a consistent manner because a robot is used as a confederate in the elicitation procedure, which ensures that every data collection session plays out in the same way. The current dataset can be used for research into human gesturing behavior, and as input for the gesture recognition and production capabilities of robots and virtual agents.
ISSN: 1554-351X
EISSN: 1554-3528
DOI: 10.3758/s13428-020-01487-0