Wearable Eye-Tracking System for Synchronized Multimodal Data Acquisition
Published in: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 34, No. 6, pp. 5146-5159
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.06.2024
Summary: Eye-tracking technology is extensively utilized in affective computing research, enabling the investigation of emotional responses through the analysis of eye movements. Integrating eye-tracking with other modalities allows for the collection of multimodal data, leading to a more comprehensive understanding of emotions and their relationship with physiological responses. This paper presents a novel head-mounted eye-tracking system for multimodal data acquisition, with a completely redesigned structure and improved performance. We propose an efficient and robust pupil-fitting method based on deep learning and RANSAC, which achieves better pupil segmentation when the pupil is partially occluded, and we build a 3D eye model to obtain gaze points. Existing eye trackers for multimodal synchronous data collection either support a limited set of devices or suffer from significant synchronization delays. Our proposed hard real-time synchronization mechanism achieves microsecond-level latency at low cost, which facilitates multimodal analysis for affective computing research. The uniquely designed exterior effectively reduces facial occlusion, making the device more comfortable to wear while facilitating the capture of facial expressions.
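To illustrate the RANSAC component of pupil fitting described above, here is a minimal sketch of classical RANSAC ellipse (conic) fitting on contour points. This is not the paper's actual pipeline (the deep-learning segmentation stage is omitted, and the function names `fit_conic` and `ransac_ellipse` are hypothetical); it only shows how random sampling with a consensus set makes the fit robust when part of the pupil contour is corrupted, e.g. by eyelid occlusion.

```python
import numpy as np

def fit_conic(pts):
    """Least-squares conic fit a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1
    (the constant term is fixed, which excludes conics through the origin)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(pts)), rcond=None)
    return coef

def conic_residual(coef, pts):
    """Algebraic distance of each point from the conic."""
    x, y = pts[:, 0], pts[:, 1]
    a, b, c, d, e = coef
    return np.abs(a * x * x + b * x * y + c * y * y + d * x + e * y - 1.0)

def ransac_ellipse(pts, n_iter=200, thresh=0.01, seed=None):
    """Fit an ellipse to contour points while ignoring outliers,
    e.g. spurious contour segments from a partially occluded pupil."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        # 5 points determine a conic; fit a candidate and count its support
        sample = pts[rng.choice(len(pts), 5, replace=False)]
        coef = fit_conic(sample)
        inliers = conic_residual(coef, pts) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit on the consensus set for the final estimate
    return fit_conic(pts[best_inliers]), best_inliers
```

With noise-free points on an ellipse plus a handful of far-away outliers, the consensus set recovers exactly the on-ellipse points, so the final refit is unaffected by the outliers.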
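The paper's synchronization mechanism is a hardware hard real-time design; as a purely illustrative software-side sketch (the function name `map_to_master` is hypothetical and not from the paper), the core idea of aligning a device's stream to a master timeline via shared trigger events can be written as a linear clock mapping that corrects both offset and drift:

```python
def map_to_master(device_ts, trig_dev, trig_mas):
    """Map device-clock timestamps onto the master timeline using two
    trigger events observed by both clocks, correcting offset and drift."""
    (d0, d1), (m0, m1) = trig_dev, trig_mas
    rate = (m1 - m0) / (d1 - d0)  # relative clock rate between the two clocks
    return [m0 + (t - d0) * rate for t in device_ts]

# e.g. a device clock running at half speed relative to the master:
# map_to_master([0.0, 5.0, 10.0], (0.0, 10.0), (100.0, 120.0))
# -> [100.0, 110.0, 120.0]
```

In a real system the trigger timestamps would come from the hardware synchronization signal, and more than two anchor points would typically be used with a least-squares fit.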
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2023.3332814