A Wearable Assistive Device for Blind Pedestrians Using Real-Time Object Detection and Tactile Presentation


Detailed Bibliography

Published in: Sensors (Basel, Switzerland), Volume 22, Issue 12, p. 4537
Main authors: Shen, Junjie; Chen, Yiwen; Sawada, Hideyuki
Document type: Journal article
Language: English
Published: Basel: MDPI AG, 16 June 2022
ISSN: 1424-8220
DOI: 10.3390/s22124537

Abstract: Nowadays, improving the traffic safety of visually impaired people is a topic of widespread concern. To help avoid the risks and hazards of road traffic in their daily life, we propose a wearable device using object detection techniques and a novel tactile display made from shape-memory alloy (SMA) actuators. After detecting obstacles in real time, the tactile display attached to a user's hands presents different tactile sensations to show the position of the obstacles. To run the computationally intensive object detection algorithm on a low-memory mobile device, we introduced a slimming compression method to remove 90% of the redundant structures of the neural network. We also designed a dedicated driving circuit board that can efficiently drive the SMA-based tactile displays. In addition, we conducted several experiments to verify the wearable assistive device's performance. The results showed that the subject was able to recognize the left or right position of a stationary obstacle with 96% accuracy and also successfully avoided collisions with moving obstacles using the wearable assistive device.
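The "slimming compression method" the abstract refers to is presumably in the spirit of network slimming, where channels are ranked by the magnitude of their batch-normalization scaling factors and the weakest ones are pruned. The paper's actual implementation is not given here; the sketch below (assuming NumPy, with an illustrative `slim_channels` helper and made-up scale values) shows only the channel-selection step for a 90% prune ratio.

```python
import numpy as np

def slim_channels(gamma, prune_ratio=0.9):
    """Select channels to keep, ranked by the magnitude of their
    batch-norm scaling factors (network-slimming-style pruning).

    gamma       : 1-D array of per-channel scaling factors.
    prune_ratio : fraction of channels to remove (0.9 = keep ~10%).
    Returns the sorted indices of the channels that survive.
    """
    n_keep = max(1, int(round(len(gamma) * (1.0 - prune_ratio))))
    # Rank channels by |gamma|, largest first, and keep the top n_keep.
    keep = np.argsort(np.abs(gamma))[::-1][:n_keep]
    return np.sort(keep)

# Example: 10 channels, 90% pruning keeps only the strongest channel.
gamma = np.array([0.01, 0.8, 0.02, 0.03, 0.6, 0.05, 0.02, 0.01, 0.04, 0.03])
keep = slim_channels(gamma, prune_ratio=0.9)
print(keep)  # -> [1], the channel with the largest scaling factor
```

After selecting the surviving channels, the corresponding convolution filters and batch-norm parameters would be copied into a smaller network, which is then fine-tuned to recover accuracy.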