Multiview Multitask Gaze Estimation With Deep Convolutional Neural Networks

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 30, No. 10, pp. 3010-3023
Main Authors: Lian, Dongze; Hu, Lina; Luo, Weixin; Xu, Yanyu; Duan, Lixin; Yu, Jingyi; Gao, Shenghua
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2019
Summary: Gaze estimation, which aims to predict gaze points from given eye images, is an important task in computer vision because of its applications in understanding human visual attention. Many existing methods are based on a single camera, and most of them focus only on either gaze point estimation or gaze direction estimation. In this paper, we propose a novel multitask method for gaze point estimation using multiview cameras. Specifically, we analyze the close relationship between gaze point estimation and gaze direction estimation, and we use a partially shared convolutional neural network architecture to simultaneously estimate the gaze direction and the gaze point. Furthermore, we introduce a new multiview gaze tracking data set that consists of multiview eye images of different subjects. As far as we know, it is the largest multiview gaze tracking data set. Comprehensive experiments on our multiview gaze tracking data set and existing data sets demonstrate that our multiview multitask gaze point estimation solution consistently outperforms existing methods.
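As a rough illustration of the partially shared multitask architecture described in the summary, the sketch below shows one way such a network can be organized: a shared convolutional trunk extracts eye features that feed two task-specific heads, one regressing a gaze direction (yaw, pitch) and one regressing a 2-D on-screen gaze point. This is not the authors' implementation; the layer sizes, the assumed 36x60 grayscale eye crop, and the equally weighted loss are illustrative assumptions only (PyTorch).

# Hypothetical sketch of a partially shared multitask gaze network.
# All layer sizes and the 36x60 input resolution are assumptions.
import torch
import torch.nn as nn

class MultitaskGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared feature extractor (common to both tasks).
        self.shared = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        feat_dim = 64 * 9 * 15  # for an assumed 36x60 grayscale eye image
        # Task-specific heads: only the lower layers are shared.
        self.direction_head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2)  # yaw, pitch
        )
        self.point_head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2)  # x, y on screen
        )

    def forward(self, eye_image):
        feats = self.shared(eye_image)
        return self.direction_head(feats), self.point_head(feats)

# Joint training minimizes a weighted sum of the two regression losses,
# so the direction labels and point labels shape the same shared features.
model = MultitaskGazeNet()
eye = torch.randn(8, 1, 36, 60)  # batch of assumed-size eye crops
direction, point = model(eye)
loss = nn.functional.mse_loss(direction, torch.zeros(8, 2)) \
     + nn.functional.mse_loss(point, torch.zeros(8, 2))
loss.backward()

Training both heads jointly is what lets the close relationship between gaze direction and gaze point, which the summary highlights, benefit the gaze point regressor.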
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2018.2865525