Combining gaze input and touch surface input for user interfaces in augmented and/or virtual reality


Bibliographic Details
Main Authors: McKenzie, Chris; Raffle, Hayes S.; Li, Chun Yat Frank
Format: Patent
Language: English
Published: 30.04.2019
Summary: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze-directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
Bibliography: Application Number: US201615386594
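
The interaction described in the summary, where a gaze ray selects a key on a virtual keyboard and a separate touch/controller event confirms the selection, can be sketched roughly as follows. This is a hypothetical illustration, not the patent's claimed implementation: the `Ray` type, the flat keyboard plane at z = -1, the key size, and the three-row layout are all assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Ray:
    origin: tuple      # (x, y, z) eye/head position (assumed representation)
    direction: tuple   # gaze direction vector

KEY_SIZE = 0.05  # assumed key width/height in meters

# Assumed three-row layout of a minimal virtual keyboard on the plane z = -1.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def intersect_keyboard_plane(ray, plane_z=-1.0):
    """Return the (x, y) point where the gaze ray hits the keyboard plane,
    or None if the ray points away from the plane."""
    ox, oy, oz = ray.origin
    dx, dy, dz = ray.direction
    if abs(dz) < 1e-9:
        return None
    t = (plane_z - oz) / dz
    if t <= 0:
        return None
    return (ox + t * dx, oy + t * dy)

def key_at(point):
    """Map a plane-space point to a key character, or None if off-keyboard.
    Rows descend from y = 0; columns grow from x = 0."""
    if point is None:
        return None
    x, y = point
    row = int(-y / KEY_SIZE)
    if not 0 <= row < len(ROWS):
        return None
    col = int(x / KEY_SIZE)
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]

def gaze_plus_touch(ray, touch_pressed):
    """Commit the gazed-at key only when the touch surface confirms it,
    mirroring the gaze-plus-controller combination in the summary."""
    key = key_at(intersect_keyboard_plane(ray))
    return key if (key and touch_pressed) else None
```

In this sketch gaze alone only hovers over a key; text is entered only when the touch confirmation arrives, which is one plausible reading of combining gaze input with touch-surface input. A gaze-swipe variant would instead accumulate the sequence of keys the ray crosses while the touch surface is held.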