Enhancing the Applicability of Sign Language Translation

Bibliographic Details
Published in: IEEE Transactions on Mobile Computing, Vol. 23, No. 9, pp. 8634-8648
Main Authors: Li, Jiao; Xu, Jiakai; Liu, Yang; Xu, Weitao; Li, Zhenjiang
Format: Journal Article
Language: English
Published: IEEE, 01.09.2024

Summary: This paper addresses a significant but overlooked problem in American Sign Language (ASL) translation systems. Current designs collect excessive sensing data for each word and treat every sentence as new, requiring sensing data to be collected from scratch. This process is time-consuming, taking from hours to half a day per user, which places an unnecessary burden on end-users and hinders the widespread adoption of ASL systems. This study identifies the root cause of the issue and proposes GASLA, a wearable sensor-based solution that automatically generates sentence-level sensing data from word-level data. An acceleration approach is further proposed to speed up data generation. Moreover, to close the gap between generated sentence data and directly collected sentence data, a template strategy is proposed to make the generated sentences more similar to collected ones. The generated data can be used to train ASL systems effectively while significantly reducing overhead. GASLA offers several benefits over current approaches: it reduces both the initial setup time and the overhead of adding new sentences later; it requires only two samples per sentence instead of the roughly ten needed by current systems; and it improves overall performance significantly.
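
To make the core idea concrete, here is a minimal sketch (in Python/NumPy, not taken from the paper) of how sentence-level sensing data could be assembled from word-level clips. The cross-fade splice, the function name, and all parameters below are illustrative assumptions; the paper's actual GASLA generation and template strategy are more involved.

import numpy as np

def splice_words(word_clips, overlap=10):
    # Concatenate per-word sensor clips into one sentence-level clip.
    # word_clips: list of (T_i, C) arrays, one per word, where C is the
    # number of sensor channels (e.g. accelerometer/gyroscope axes).
    # overlap: samples cross-faded at each boundary to smooth transitions.
    sentence = word_clips[0].astype(float)
    for clip in word_clips[1:]:
        clip = clip.astype(float)
        n = min(overlap, len(sentence), len(clip))
        if n == 0:
            # Degenerate case: nothing to blend, just append.
            sentence = np.concatenate([sentence, clip])
            continue
        # Linear cross-fade over the boundary region.
        w = np.linspace(0.0, 1.0, n)[:, None]
        blended = (1.0 - w) * sentence[-n:] + w * clip[:n]
        sentence = np.concatenate([sentence[:-n], blended, clip[n:]])
    return sentence

# Hypothetical word-level clips: 3 words, 100 samples each, 6 IMU channels.
rng = np.random.default_rng(0)
words = [rng.standard_normal((100, 6)) for _ in range(3)]
print(splice_words(words, overlap=10).shape)  # (280, 6): two 10-sample overlaps

In the paper, the template strategy further adjusts such generated sentences toward directly collected ones; that refinement step is not modeled in this sketch.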
ISSN: 1536-1233
EISSN: 1558-0660
DOI: 10.1109/TMC.2024.3350111