An augmented reality sign-reading assistant for users with reduced vision

Bibliographic Details
Published in: PLoS ONE, Vol. 14, No. 1, p. e0210630
Main Authors: Huang, Jonathan; Kinateder, Max; Dunn, Matt J.; Jarosz, Wojciech; Yang, Xing-Dong; Cooper, Emily A.
Format: Journal Article
Language: English
Published: Public Library of Science (PLoS), United States, 16.01.2019

Summary: People typically rely heavily on visual information when finding their way to unfamiliar locations. For individuals with reduced vision, there are a variety of navigational tools available to assist with this task if needed. However, for wayfinding in unfamiliar indoor environments the applicability of existing tools is limited. One potential approach to assist with this task is to enhance visual information about the location and content of existing signage in the environment. With this aim, we developed a prototype software application, which runs on a consumer head-mounted augmented reality (AR) device, to assist visually impaired users with sign-reading. The sign-reading assistant identifies real-world text (e.g., signs and room numbers) on command, highlights the text location, converts it to high-contrast AR lettering, and optionally reads the content aloud via text-to-speech. We assessed the usability of this application in a behavioral experiment. Participants with simulated visual impairment were asked to locate a particular office within a hallway, either with or without AR assistance (referred to as the AR group and control group, respectively). Subjective assessments indicated that participants in the AR group found the application helpful for this task, and an analysis of walking paths indicated that these participants took more direct routes compared to the control group. However, participants in the AR group also walked more slowly and took more time to complete the task than the control group. The results point to several specific future goals for usability and system performance in AR-based assistive tools.
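
The summary describes a three-stage pipeline: detect text in the camera view on command, re-render it as high-contrast lettering at its detected location, and optionally speak it aloud. The sketch below approximates that pipeline on a single desktop image using common open-source libraries (pytesseract for OCR, OpenCV for the overlay, pyttsx3 for text-to-speech); the function name and confidence threshold are illustrative assumptions, and this is not the authors' head-mounted AR implementation.

    # Hypothetical desktop approximation of the sign-reading assistant:
    # OCR a frame, redraw each detected word as a high-contrast overlay,
    # and optionally read the text aloud. Not the authors' AR device code.
    import cv2
    import pytesseract
    import pyttsx3

    def assist_sign_reading(frame, speak=False, min_conf=60.0):
        """Detect text in a BGR image, redraw each word as white-on-black
        lettering over its detected location, and optionally speak it."""
        data = pytesseract.image_to_data(frame, output_type=pytesseract.Output.DICT)
        words = []
        for i, word in enumerate(data["text"]):
            word = word.strip()
            if not word or float(data["conf"][i]) < min_conf:
                continue  # skip empty boxes and low-confidence detections
            x, y, w, h = (data[k][i] for k in ("left", "top", "width", "height"))
            # High-contrast re-rendering: filled black box, white lettering.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 0), -1)
            cv2.putText(frame, word, (x, y + h), cv2.FONT_HERSHEY_SIMPLEX,
                        max(h / 22.0, 0.5), (255, 255, 255), 2)
            words.append(word)
        if speak and words:
            engine = pyttsx3.init()      # platform text-to-speech backend
            engine.say(" ".join(words))
            engine.runAndWait()
        return frame

    # Example usage on a single photo of a hallway sign:
    # annotated = assist_sign_reading(cv2.imread("sign.jpg"), speak=True)
    # cv2.imwrite("annotated.jpg", annotated)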
Current address: National Research Council Canada – Fire Safety, Ottawa, Canada
Competing Interests: This research was supported by gifts provided by Microsoft (to EAC, WJ, and XDY) and Oculus (to EAC). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. This does not alter the authors’ adherence to PLOS ONE policies on sharing data and materials.
Current address: School of Optometry, Vision Science Program, University of California, Berkeley, Berkeley, California, United States of America
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0210630