Real-time Gaze Tracking with Head-eye Coordination for Head-mounted Displays


Bibliographic Details
Published in: 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 82–91
Main Authors: Chen, Lingling; Li, Yingxi; Bai, Xiaowei; Wang, Xiaodong; Hu, Yongqiang; Song, Mingwu; Xie, Liang; Yan, Ye; Yin, Erwei
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2022

Summary: High-accuracy, low-latency gaze tracking is becoming one of the indispensable features of augmented reality (AR) head-mounted devices (HMDs). Researchers have proposed various approaches to predicting gaze positions from eye images. However, because they focus on the eye modality alone, these appearance-based algorithms still struggle to trade off accuracy against running speed on HMDs. In this paper, we propose a lightweight multi-modal network (HE-Tracker) to regress gaze positions. By fusing head-movement features with eye features, HE-Tracker achieves comparable accuracy (3.655° across all subjects) and a 27× speedup (48 fps on the specialized AR HMD) relative to the state-of-the-art gaze tracking algorithm. We further demonstrate that when our head-eye coordination strategy is applied to other baseline models, each of them improves by at least 6.36% without a pronounced effect on running speed. Moreover, we construct HE-Gaze, the first multi-modal dataset with eye images and head-movement data for near-eye gaze tracking. The dataset currently comprises 757,360 frames from 15 participants, providing an opportunity to foster research on multi-modal gaze tracking approaches. Our dataset is available at DOWNLOAD LINK 1.
DOI: 10.1109/ISMAR55827.2022.00022
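
As a rough illustration of the head-eye fusion idea described in the summary, the Python sketch below fuses features from a small eye-image CNN with features from an MLP over head-movement readings, then regresses a 2-D gaze position from the concatenated features. All concrete details here (layer sizes, the 6-D head-movement input, concatenation as the fusion operator, and the name HeadEyeGazeNet) are illustrative assumptions, not the authors' HE-Tracker architecture.

# Minimal sketch of a multi-modal gaze regressor in the spirit of
# head-eye coordination. Architectural details are assumptions, not
# the HE-Tracker design from the paper.
import torch
import torch.nn as nn

class HeadEyeGazeNet(nn.Module):
    def __init__(self, imu_dim: int = 6):
        super().__init__()
        # Lightweight CNN over a single grayscale eye image.
        self.eye_encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (B, 32)
        )
        # Small MLP over head-movement readings (a 6-D IMU-style
        # input of angular velocity + acceleration is assumed).
        self.head_encoder = nn.Sequential(
            nn.Linear(imu_dim, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
        )
        # Fuse by concatenation and regress a 2-D gaze position.
        self.regressor = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, eye_img: torch.Tensor, head: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.eye_encoder(eye_img), self.head_encoder(head)], dim=1
        )
        return self.regressor(fused)

# Smoke test with dummy data shaped like one eye frame plus one
# head-movement reading.
model = HeadEyeGazeNet()
gaze = model(torch.randn(1, 1, 64, 64), torch.randn(1, 6))
print(gaze.shape)  # torch.Size([1, 2])

Concatenation is only one plausible fusion choice; the reported gains for applying head-eye coordination to other baselines suggest the strategy is largely independent of the particular eye-feature backbone.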