Gaze prediction using machine learning for dynamic stereo manipulation in games

Bibliographic Details
Published in: 2016 IEEE Virtual Reality (VR), pp. 113 - 120
Main Authors: Koulieris, George Alex; Drettakis, George; Cunningham, Douglas; Mania, Katerina
Format: Conference Proceeding
Language: English
Published: IEEE, 01.03.2016
ISSN: 2375-5334
DOI: 10.1109/VR.2016.7504694

Summary: Comfortable, high-quality 3D stereo viewing is becoming a requirement for interactive applications today. Previous research shows that manipulating disparity can alleviate some of the discomfort caused by 3D stereo, but it is best to do this locally, around the object the user is gazing at. The main challenge is thus to develop a gaze predictor in the demanding context of real-time, heavily task-oriented applications such as games. Our key observation is that player actions are highly correlated with the present state of a game, encoded by game variables. Based on this, we train a classifier to learn these correlations using an eye-tracker which provides the ground-truth object being looked at. The classifier is used at runtime to predict object category - and thus gaze - during game play, based on the current state of game variables. We use this prediction to propose a dynamic disparity manipulation method, which provides rich and comfortable depth. We evaluate the quality of our gaze predictor numerically and experimentally, showing that it predicts gaze more accurately than previous approaches. A subjective rating study demonstrates that our localized disparity manipulation is preferred over previous methods.
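The summary describes a two-stage pipeline: offline, a classifier is trained on game-state variables with eye-tracked fixations providing the ground-truth object category; at runtime, the current game state is fed to the classifier to predict the gazed object category, which then drives a localized disparity manipulation. The sketch below is a minimal illustration of that idea only, not the authors' implementation: the learner (a scikit-learn random forest), the game-state features, and the object categories are all hypothetical stand-ins.

# Minimal sketch of the gaze-prediction stage, under the assumptions stated above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-frame game-state features, e.g.
# [player_health, ammo_remaining, distance_to_nearest_enemy,
#  enemies_visible, time_since_last_hit, objective_active]
X = np.random.rand(5000, 6)

# Eye-tracker ground truth: index of the object category being fixated
# (e.g. 0 = enemy, 1 = pickup, 2 = objective marker, 3 = environment).
y = np.random.randint(0, 4, size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# Offline training on eye-tracked sessions.
clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# At runtime, each frame's game state is classified; the predicted category
# indicates where to centre the local disparity manipulation.
current_state = np.random.rand(1, 6)
predicted_category = clf.predict(current_state)[0]
print("predicted gaze-object category:", predicted_category)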