Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach
Published in | IEEE Transactions on Haptics; Vol. 14, No. 1; pp. 109–122 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2021 |
Subjects | |
Summary: | Recently, in an effort to increase the autonomy of blind people and improve their quality of life, considerable work has been devoted to developing technological travel aids. These systems can surrogate spatial information about the environment and deliver it to end users through sensory substitution (auditory, haptic). However, despite promising research outcomes, these solutions have met with scarce acceptance in the real world, often because real end users were only marginally involved in the conceptual and design phases. In this article, we propose a novel indoor navigation system based on wearable haptic technologies. All developmental phases were driven by continuous feedback from visually impaired persons. The proposed travel aid consists of an RGB-D camera, a processing unit that computes visual information for obstacle avoidance, and a wearable device that can provide normal and tangential force cues for guidance in an unknown indoor environment. Experiments with blindfolded subjects and visually impaired participants show that our system can be an effective support during indoor navigation and a viable tool for training blind people in the use of travel aids. |
---|---|
ISSN: | 1939-1412, 2329-4051 |
DOI: | 10.1109/TOH.2020.2996748 |
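
The abstract above outlines a pipeline in which depth data from an RGB-D camera is processed into obstacle-avoidance guidance that a wearable device renders as normal and tangential force cues. The short Python sketch below illustrates one possible depth-frame-to-cue mapping under stated assumptions; the function name `obstacle_cues`, the left/centre/right region split, and the `near_limit`/`far_limit` thresholds are hypothetical and do not reflect the authors' actual algorithm or hardware interface.

```python
# Minimal sketch of a sensing-to-haptics mapping in the spirit of the abstract.
# All names, thresholds, and the three-region split are illustrative assumptions,
# not the paper's implementation.
import numpy as np

def obstacle_cues(depth_m: np.ndarray,
                  near_limit: float = 0.5,
                  far_limit: float = 2.0) -> dict:
    """Map a depth frame (metres) to coarse haptic guidance cues.

    Returns a normal-force intensity in [0, 1] (proximity warning) and a
    tangential cue in {-1, 0, +1} suggesting a steering direction away
    from the most obstructed region.
    """
    h, w = depth_m.shape
    thirds = np.array_split(np.arange(w), 3)   # left / centre / right column strips

    # Closest valid depth in each vertical strip (zero readings = missing data).
    region_min = []
    for cols in thirds:
        strip = depth_m[:, cols]
        valid = strip[strip > 0]
        region_min.append(valid.min() if valid.size else np.inf)

    closest = min(region_min)
    # Normal-force cue: 0 beyond far_limit, saturating to 1 at/inside near_limit.
    intensity = float(np.clip((far_limit - closest) / (far_limit - near_limit), 0.0, 1.0))

    # Tangential cue: if the centre strip is obstructed, steer toward the more open side.
    left, centre, right = region_min
    if centre <= far_limit:
        steer = -1 if left > right else 1      # -1 = veer left, +1 = veer right
    else:
        steer = 0
    return {"normal_intensity": intensity, "tangential_steer": steer}

# Toy usage: synthetic 480x640 depth frame with an obstacle ~0.8 m ahead, centre-right.
frame = np.full((480, 640), 3.0)
frame[100:400, 300:500] = 0.8
print(obstacle_cues(frame))   # -> {'normal_intensity': 0.8, 'tangential_steer': -1}
```

In this toy mapping, proximity drives the strength of the normal-force warning while the left/right comparison drives the tangential (steering) cue, mirroring the abstract's separation of obstacle warning and guidance; how the real system fuses these with path planning is described in the full article.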