Robust Gaze Point Estimation for Metaverse With Common Mode Features Suppression Network

Bibliographic Details
Published in: IEEE Transactions on Consumer Electronics, Vol. 70, No. 1, pp. 2090-2098
Main Authors: Xu, Xu; Chen, Junxin; Li, Congsheng; Fu, Chong; Yang, Lei; Yan, Yan; Lyu, Zhihan
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.02.2024

Summary: Gaze point estimation is an essential technology in extended reality devices and plays a vital role in the metaverse for consumer health. A challenging issue is the great variation underlying gaze-related encoded information, which makes it difficult for current methods to generate gaze points of stable quality across different persons. To this end, we propose a robust framework that generates high-accuracy gaze points from appearance. Specifically, eye and face regions are fused in the network to enrich the high-level features. We then develop a common mode features suppression network (CMFS-Net), inspired by the differential amplifier, to predict the gaze bias between the input image and a standard image. In addition, a Point-Mean algorithm is designed to generate the estimated gaze point from candidate points. The performance of CMFS-Net is evaluated on three gaze estimation datasets: GazePC, GazeCapture, and Rice TabletGaze. Among them, GazePC was collected by ourselves and comprises 41.25K images from 165 participants. Experimental results demonstrate that our model is effective and robust for appearance-based gaze point estimation and has advantages over peer methods.
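
The abstract outlines two components that lend themselves to a brief illustration: a differential, common-mode-suppressing prediction of gaze bias against a standard image, and a Point-Mean aggregation of candidate points. The PyTorch sketch below is one reading of those ideas only; the class name, layer sizes, and the plain-mean aggregation are assumptions made for illustration, not the paper's actual CMFS-Net architecture or Point-Mean rule.

import torch
import torch.nn as nn

class CMFSSketch(nn.Module):
    """Differential two-branch head: a shared encoder embeds the input
    frame and a 'standard' reference frame; subtracting the embeddings
    cancels person-specific (common-mode) appearance features, and a
    linear head regresses the gaze bias of the input relative to the
    reference. Illustrative only, not the published architecture."""

    def __init__(self, feat_dim=128):
        super().__init__()
        # Toy shared-weight encoder applied to both branches.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.bias_head = nn.Linear(feat_dim, 2)  # 2-D gaze bias

    def forward(self, image, standard):
        # Differential step: features common to both frames cancel,
        # leaving the gaze-dependent difference (analogous to a
        # differential amplifier rejecting its common-mode input).
        diff = self.encoder(image) - self.encoder(standard)
        return self.bias_head(diff)

def point_mean(candidates):
    # Placeholder aggregation: average the candidate gaze points.
    # The abstract does not specify the actual Point-Mean rule.
    return candidates.mean(dim=0)

# Usage on dummy tensors:
model = CMFSSketch()
frame = torch.randn(1, 3, 64, 64)     # current appearance image
standard = torch.randn(1, 3, 64, 64)  # reference image of the same user
# Several hypothetical candidate points (e.g., from multiple frames):
candidates = torch.cat([model(frame, standard) for _ in range(3)], dim=0)
gaze_point = point_mean(candidates)   # final (2,) screen coordinate

The key design point the abstract emphasizes is the subtraction against a standard image: person-specific appearance acts as a common-mode signal shared by both branches, so differencing suppresses it and leaves the gaze-dependent component to be regressed.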
ISSN: 0098-3063
ISSN: 1558-4127
DOI: 10.1109/TCE.2024.3351190