Combining gaze input and touch surface input for user interfaces in augmented and/or virtual reality
Format: Patent
Language: English
Published: 30.04.2019
Summary: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate that input into a corresponding interaction in the virtual environment. Gaze-directed swipes on a virtual keyboard displayed in the virtual environment may be detected, tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
Bibliography: Application Number: US201615386594
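The pipeline the summary describes (detect a gaze direction, project it onto a virtual interface, and translate a gaze-directed swipe into text) can be sketched roughly as below. This is a minimal illustration, not the patented implementation: the `Key` layout, the fixed keyboard plane at `z = plane_z`, and the helper names `gaze_hit` and `keys_along_swipe` are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Key:
    """A hypothetical key on a virtual keyboard plane (center + half-extent)."""
    char: str
    x: float
    y: float
    half_w: float = 0.045
    half_h: float = 0.045

def gaze_hit(origin, direction, plane_z):
    """Intersect a gaze ray (origin, direction) with the keyboard plane at
    z = plane_z; return the (x, y) hit point, or None if the ray points away."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None
    t = (plane_z - oz) / dz
    if t <= 0:  # plane is behind the user
        return None
    return (ox + t * dx, oy + t * dy)

def keys_along_swipe(gaze_samples, keyboard, plane_z, origin=(0.0, 0.0, 0.0)):
    """Translate a sequence of gaze-direction samples into the characters
    the gaze swipe passes over, collapsing consecutive duplicates."""
    chars = []
    for direction in gaze_samples:
        hit = gaze_hit(origin, direction, plane_z)
        if hit is None:
            continue
        x, y = hit
        for key in keyboard:
            if abs(x - key.x) <= key.half_w and abs(y - key.y) <= key.half_h:
                if not chars or chars[-1] != key.char:
                    chars.append(key.char)
                break
    return "".join(chars)

# A two-key keyboard and a left-to-right gaze swipe across it:
keyboard = [Key("h", -0.05, 0.0), Key("i", 0.05, 0.0)]
swipe = [(-0.05, 0.0, 1.0), (0.05, 0.0, 1.0)]
print(keys_along_swipe(swipe, keyboard, plane_z=1.0))  # → hi
```

A real system would additionally smooth noisy gaze samples and, as the summary notes, could fuse this gaze-derived input with button or touch input received by the controller before committing the text.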