Vid2Real HRI: Align video-based HRI study designs with real-world settings
| Field | Value |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | 23.03.2024 |
| DOI | 10.48550/arxiv.2403.15798 |
Summary: HRI research using autonomous robots in real-world settings can produce results with the highest ecological validity of any study modality, but many difficulties limit such studies' feasibility and effectiveness. We propose Vid2Real HRI, a research framework to maximize the real-world insights offered by video-based studies. The Vid2Real HRI framework was used to design an online study using first-person videos of robots as real-world encounter surrogates. The online study ($n = 385$) distinguished the within-subjects effects of four robot behavioral conditions on perceived social intelligence and human willingness to help the robot enter an exterior door. A real-world, between-subjects replication ($n = 26$) using two conditions confirmed the validity of the online study's findings and the sufficiency of the participant recruitment target ($n = 22$) based on a power analysis of the online study's results. The Vid2Real HRI framework offers HRI researchers a principled way to take advantage of the efficiency of video-based study modalities while generating directly transferable knowledge of real-world HRI. Code and data from the study are provided at https://vid2real.github.io/vid2realHRI
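The abstract notes that the replication's recruitment target was derived from a power analysis of the online study's results. As a rough illustration only (not the paper's actual procedure, and the effect size $d = 1.2$ below is a placeholder, not a value reported here), a per-group sample size for a two-condition between-subjects comparison can be approximated with the standard normal-approximation formula:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Normal-approximation sample size per group for a two-sample comparison:
    n ≈ 2 * (z_{1-alpha/2} + z_{power})^2 / d^2, rounded up."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2)

# Placeholder effect size d = 1.2 (illustrative, not taken from the paper).
print(n_per_group(1.2))  # → 11 per group, i.e. 22 total across two conditions
```

With these placeholder inputs the formula yields 11 participants per condition, or 22 total, matching the order of magnitude of the recruitment target quoted in the abstract.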