Gesture recognition based on surface electromyography‐feature image
Published in: Concurrency and Computation, Vol. 33, no. 6
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc (Wiley Subscription Services, Inc), 25.03.2021
Summary: In surface electromyography (sEMG) gesture recognition, traditional machine learning models are sensitive to the choice of sEMG feature extraction method, which makes the subtle differences between similar gestures difficult to distinguish. Taking the NinaPro DB1 dataset as the research object, sEMG feature images are combined with a Convolutional Neural Network (CNN) to recognize 52 gesture movements. The CNN model overcomes the limitations of traditional machine learning in sEMG gesture recognition, and 1-dim convolution kernels are incorporated to extract deep features and improve recognition performance. Finally, simulation experiments show that, compared with raw-sEMG images with a CNN, single sEMG-feature images with a CNN, and sEMG features with traditional machine learning, the multi-sEMG-feature image with a CNN achieves the highest accuracy, reaching 82.54%.
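The abstract does not specify which features form the "multi-sEMG-features image". As a minimal sketch only, assuming three common time-domain features (mean absolute value, root mean square, waveform length) computed per electrode channel and stacked into a 2-D image, as is typical in the sEMG literature:

```python
import numpy as np

def semg_feature_image(window):
    """Stack per-channel time-domain features into a 2-D 'feature image'
    of shape (features, channels). MAV, RMS, and waveform length are
    illustrative choices, not necessarily those used in the paper.
    window: array of shape (samples, channels)."""
    mav = np.mean(np.abs(window), axis=0)                  # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))            # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)   # waveform length
    return np.stack([mav, rms, wl])                        # shape (3, channels)

# Example: a 200-sample window from the 10 electrode channels of NinaPro DB1
rng = np.random.default_rng(0)
win = rng.standard_normal((200, 10))
img = semg_feature_image(win)
print(img.shape)  # (3, 10)
```

Each such image would then be fed to the CNN; a 1-dim convolution kernel sliding along the channel axis can extract deeper features from these stacked rows.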
Bibliography: Funding information: Hubei Provincial Department of Education, D20191105; National Defense Pre-Research Foundation of Wuhan University of Science and Technology, GF201705; National Natural Science Foundation of China, 51575407, 51505349, 61733011, 41906177; Open Fund of the Key Laboratory for Metallurgical Equipment and Control of Ministry of Education in Wuhan University of Science and Technology, 2018B07
ISSN: 1532-0626; 1532-0634
DOI: 10.1002/cpe.6051