ARtention: A design space for gaze-adaptive user interfaces in augmented reality
Published in: Computers & Graphics, Vol. 95, pp. 1–12
Format: Journal Article
Language: English
Published: Oxford: Elsevier Ltd, 01.04.2021
Summary:
• We present a design space that articulates the role of gaze in AR across three factors.
• Factor 1: Reality–virtuality continuum transitions between reality and AR content.
• Factor 2: Information-level transitions from concise to detailed AR content.
• Factor 3: Task transitions in response to standard gaze input.
• We design, implement, and evaluate three AR applications based on these factors.
Augmented Reality (AR) headsets extended with eye tracking, a promising input technology owing to its natural and implicit nature, open a wide range of new interaction capabilities for everyday use. In this paper, we present ARtention, a design space for gaze interaction tailored specifically to in-situ AR information interfaces. It highlights three important dimensions to consider in the UI design of such gaze-enabled applications: transitions from reality to the virtual interface, from single- to multi-layer content, and from information consumption to selection tasks. Such transitional aspects bring previously isolated gaze interaction concepts together into a unified AR space, enabling more advanced application control seamlessly mediated by gaze. We describe these factors in detail. To illustrate how the design space can be used, we present three prototype applications and report informal user feedback from different scenarios: a conversational UI, viewing a 3D visualization, and browsing items for shopping. We conclude with design considerations derived from our development and evaluation of the prototypes. We expect these to be valuable for researchers and designers investigating the use of gaze input in AR systems and applications.
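The selection tasks mentioned in the abstract typically rely on dwell time, where a gaze fixation held on a target for long enough acts as a click. As a minimal sketch of that idea (the `DwellSelector` class, its threshold, and its API are hypothetical illustrations, not the paper's implementation):

```python
import time


class DwellSelector:
    """Fires a selection once gaze has rested on the same target
    for at least `threshold_s` seconds (a single fire per fixation)."""

    def __init__(self, threshold_s=0.8):  # 0.8 s is an assumed dwell threshold
        self.threshold_s = threshold_s
        self.current_target = None
        self.dwell_start = None

    def update(self, target, now=None):
        """Feed the currently gazed-at target (or None when gaze is elsewhere).
        Returns the target exactly once when the dwell threshold is crossed."""
        now = time.monotonic() if now is None else now
        if target != self.current_target:
            # Gaze moved to a new target (or left all targets): restart the timer.
            self.current_target = target
            self.dwell_start = now if target is not None else None
            return None
        if target is not None and now - self.dwell_start >= self.threshold_s:
            self.dwell_start = float("inf")  # prevent re-firing within this fixation
            return target
        return None
```

In use, the application would call `update()` every frame with the result of its gaze raycast; only the frame that crosses the threshold reports a selection, so a long fixation does not trigger repeatedly.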
ISSN: 0097-8493, 1873-7684
DOI: 10.1016/j.cag.2021.01.001