From Forearm to Wrist: Deep Learning for Surface Electromyography-Based Gesture Recognition

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 32, p. 1
Main Authors: He, Jiayuan; Niu, Xinyue; Zhao, Penghui; Lin, Chuang; Jiang, Ning
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2024

More Information
Summary: Though the forearm is the usual electrode site for prosthesis control, myoelectric control with electrodes on the wrist is more comfortable for general consumers because of its unobtrusiveness and its integration with existing wrist-based wearables. Recently, deep learning methods have gained attention for myoelectric control, but their performance on wrist myoelectric signals is unclear. This study compared the gesture recognition performance of wrist and forearm myoelectric signals between a state-of-the-art method, TDLDA, and four deep learning models: a convolutional neural network (CNN), a temporal convolutional network (TCN), a gated recurrent unit (GRU), and a Transformer. With forearm myoelectric signals, the deep learning models and TDLDA performed comparably; with wrist myoelectric signals, however, the deep learning models outperformed TDLDA significantly, by at least 9%, while TDLDA's performance was similar across the two electrode sites. This work demonstrates the potential of deep learning for wrist-based myoelectric control and should facilitate its application in broader settings.
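The TDLDA baseline named in the abstract conventionally denotes the classical myoelectric-control pipeline: Hudgins-style time-domain (TD) features computed per channel on sliding windows, fed to a linear discriminant analysis (LDA) classifier. As a minimal sketch of the feature-extraction stage only (the window, the `eps` noise threshold, and the single-channel assumption are illustrative, not taken from the paper):

```python
import numpy as np

def td_features(window, eps=0.01):
    """Hudgins-style time-domain features for one sEMG channel window.

    Returns [MAV, WL, ZC, SSC]:
      MAV - mean absolute value
      WL  - waveform length (sum of absolute first differences)
      ZC  - zero crossings whose amplitude change exceeds eps
      SSC - slope sign changes whose slope magnitude exceeds eps
    """
    x = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(x))
    diff = np.diff(x)
    wl = np.sum(np.abs(diff))
    # Count sign changes between consecutive samples, gated by a small
    # threshold eps to suppress baseline-noise crossings.
    zc = int(np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > eps)))
    ssc = int(np.sum((diff[:-1] * diff[1:] < 0)
                     & ((np.abs(diff[:-1]) > eps) | (np.abs(diff[1:]) > eps))))
    return np.array([mav, wl, zc, ssc])
```

In the full pipeline, these four values would be computed per channel, concatenated into one feature vector per window, and classified with LDA; the deep learning models compared in the study instead learn their representations directly from the raw windows.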
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2023.3341220