Driver's gaze zone estimation by transfer learning
| Published in | 2018 IEEE International Conference on Consumer Electronics (ICCE), pp. 1-5 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.01.2018 |
Summary: Estimating the driver's gaze zone plays an important role in supporting advanced driver assistance systems (ADAS). Gaze estimation can monitor driver focus and indirectly control the user interface/user experience (UI/UX) shown on the windshield via an augmented-reality head-up display (AR-HUD). However, training a gaze zone estimator as a classification task requires a large annotated dataset, which is costly to collect. To reduce this labeling effort, we apply transfer learning: a CNN pre-trained on a gaze-estimation regression task on mobile devices, where large and reliable datasets are available, is repurposed for the new classification task, overcoming the lack of annotated data for gaze zone estimation. We evaluated the proposed method on our own driving-simulation test bed. It achieves a validation accuracy of about 99.01% and a test accuracy of about 60.25% on unseen drivers when estimating 10 in-vehicle gaze zones.
ISSN: 2158-4001
DOI: 10.1109/ICCE.2018.8326308
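The transfer-learning recipe the abstract describes — keep a pre-trained feature extractor frozen and train only a new 10-way gaze-zone classification head — can be sketched roughly as follows. This is a minimal, framework-free illustration on synthetic data; the stand-in backbone, the dimensions, and the training data are all hypothetical, not the paper's actual pre-trained CNN or dataset.

```python
# Sketch of transfer learning for gaze-zone classification:
# a frozen "pre-trained" feature extractor plus a newly trained
# 10-way softmax head. Everything here is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

N_ZONES = 10     # in-vehicle gaze zones, per the abstract
FEAT_DIM = 64    # hypothetical feature dimensionality of the backbone

def frozen_backbone(images):
    """Stand-in for the pre-trained CNN: a fixed (untrained) projection."""
    w = np.linspace(-1, 1, images.shape[1] * FEAT_DIM)
    w = w.reshape(images.shape[1], FEAT_DIM)
    return np.tanh(images @ w)

# Synthetic "driver face" inputs and gaze-zone labels.
X = rng.normal(size=(200, 32))
y = rng.integers(0, N_ZONES, size=200)

# Features are extracted once; the backbone itself is never updated.
F = frozen_backbone(X)

# New classification head: softmax regression, trained by gradient descent.
Wh = np.zeros((FEAT_DIM, N_ZONES))
onehot = np.eye(N_ZONES)[y]
for _ in range(100):
    logits = F @ Wh
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    Wh -= 0.1 * F.T @ (p - onehot) / len(X)   # cross-entropy gradient step

preds = (F @ Wh).argmax(axis=1)
print("training accuracy:", (preds == y).mean())
```

Only the head weights `Wh` are learned; the backbone's parameters stay fixed, which is what lets a small annotated gaze-zone dataset suffice.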