Enhancing eye tracking for nonhuman primates and other subjects unable to follow instructions: Adaptive calibration and validation of Tobii eye trackers with the Titta toolbox


Bibliographic Details
Published in: Behavior Research Methods, Vol. 57, No. 1, p. 4
Main Authors: Niehorster, Diederick C.; Whitham, Will; Lake, Benjamin R.; Schapiro, Steven J.; Andolina, Ian M.; Yorzinski, Jessica L.
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 04.12.2024

Summary: Accurate eye tracking is crucial for gaze-dependent research, but calibrating eye trackers in subjects who cannot follow instructions, such as human infants and nonhuman primates, presents a challenge. Traditional calibration methods rely on verbal instructions, which are ineffective for these populations. To address this, researchers often use attention-grabbing stimuli in known locations; however, existing software for video-based calibration is often proprietary and inflexible. We introduce an extension to the open-source toolbox Titta—a software package integrating desktop Tobii eye trackers with PsychToolbox experiments—to facilitate custom video-based calibration. This toolbox extension offers a flexible platform for attracting attention, calibrating using flexible point selection, and validating the calibration. The toolbox has been refined through extensive use with chimpanzees, baboons, and macaques, demonstrating its effectiveness across species. Our adaptive calibration and validation procedures provide a standardized method for achieving more accurate gaze tracking across diverse species.
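Validating a calibration, as the summary describes, amounts to measuring how far recorded gaze samples land from validation targets at known positions, conventionally reported in degrees of visual angle. A minimal sketch of that computation is below; the helper name and the simplified geometry (eye centered in front of the screen origin, flat screen) are illustrative assumptions, not part of Titta's API:

```python
import math

def angular_accuracy_deg(gaze_xy, target_xy, viewing_distance):
    """Angular offset (degrees) between a gaze sample and a target.

    gaze_xy and target_xy are on-screen positions in the same physical
    unit (e.g. cm) as viewing_distance. Assumes the eye sits centered
    in front of the screen origin; a hypothetical helper, not Titta's API.
    """
    gx, gy = gaze_xy
    tx, ty = target_xy
    d = viewing_distance
    # Vectors from the eye to the gaze point and to the target.
    v_gaze = (gx, gy, d)
    v_target = (tx, ty, d)
    dot = sum(a * b for a, b in zip(v_gaze, v_target))
    norm = math.hypot(*v_gaze) * math.hypot(*v_target)
    # Clamp guards against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Averaging this offset over all samples collected while a subject fixates each validation target yields the per-target accuracy figure that such validation procedures typically report.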
ISSN: 1554-351X (print); 1554-3528 (online)
DOI: 10.3758/s13428-024-02540-y