Evaluation of LSTM for predicting grip strength using electromyography: a comparison of setups and methods
| Published in | Neural Computing & Applications, Vol. 37, No. 21, pp. 16461-16485 |
|---|---|
| Main Authors | , , , , , , , , , |
| Format | Journal Article |
| Language | English |
| Published | London: Springer London / Springer Nature B.V., 01.07.2025 |
Summary: Despite decades of research in prosthetics and myocontrol, using electromyography (EMG) to accurately predict the force with which a user grasps an object is still a subject of investigation. Although the problem seems trivial, the optimal EMG setup, one able to deliver high prediction accuracy at minimal economic and computational cost, has yet to be found. In this work, we compare several EMG setups, consisting of one to eight sensors, and several deep learning methods, to determine which combination is most convenient. In particular, we compare long short-term memory (LSTM) with an LSTM coupled to a stacked autoencoder (LSTM–SAE) and an LSTM with an attention mechanism (LSTM–ATT). Our experimental results reveal that, while the best performance is attained by LSTM–SAE (coefficient of correlation 0.9867 ± 0.0087, coefficient of determination 0.9676 ± 0.0489, normalized root mean square error 0.048 ± 0.0213), statistically significant differences only appear when the number of sensors is drastically reduced, namely to two; even then, performance remains close to optimal and surpasses state-of-the-art methods. Further research will focus on testing the optimal approach and setup online with amputee users, using prosthetic hardware in activities of daily living.
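For readers unfamiliar with the model family compared in the abstract, the sketch below shows a minimal LSTM regressor mapping a window of multi-channel EMG samples to a scalar grip-force estimate. This is an illustrative assumption written in PyTorch, not the paper's architecture: the class name `EMGForceLSTM`, the hidden size, and the single recurrent layer are all hypothetical choices.

```python
import torch
import torch.nn as nn

class EMGForceLSTM(nn.Module):
    """Minimal sketch (hypothetical, not the paper's model) of an
    LSTM regressor: a window of multi-channel EMG samples in, a
    scalar grip-force estimate out."""

    def __init__(self, n_sensors=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_sensors,
                            hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, 1)  # scalar force output

    def forward(self, x):
        # x: (batch, time, n_sensors) window of EMG samples
        out, _ = self.lstm(x)
        # regress from the hidden state at the last time step
        return self.head(out[:, -1, :])

# Example: a batch of 4 windows, 200 time steps, 2 EMG sensors
# (the reduced setup the abstract reports as still near-optimal).
model = EMGForceLSTM(n_sensors=2)
emg = torch.randn(4, 200, 2)
print(model(emg).shape)  # torch.Size([4, 1])
```

The LSTM–SAE and LSTM–ATT variants named in the abstract would, respectively, pre-compress the EMG input with a stacked autoencoder and weight time steps with an attention mechanism before the regression head; neither extension is sketched here.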
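The three figures of merit quoted in the summary have standard definitions, which the snippet below computes on a synthetic force trace. Note that the normalization convention for the NRMSE (here, dividing by the target range) is an assumption; the paper may normalize by the mean or standard deviation instead.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """Pearson correlation coefficient (CC), coefficient of
    determination (R^2), and normalized RMSE (NRMSE)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)

    # Pearson correlation between prediction and target.
    cc = np.corrcoef(y_true, y_pred)[0, 1]

    # Coefficient of determination: 1 - SS_res / SS_tot.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    # RMSE normalized by the target range (one common convention;
    # the paper's exact normalization is not stated here).
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    nrmse = rmse / (y_true.max() - y_true.min())

    return cc, r2, nrmse

# Example: a near-perfect prediction yields CC ~ 1, R^2 ~ 1, NRMSE ~ 0.
t = np.linspace(0, 1, 200)
force = np.sin(2 * np.pi * t) ** 2               # synthetic grip-force profile
pred = force + np.random.normal(0, 0.01, t.shape)
print(evaluation_metrics(force, pred))
```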
ISSN: 0941-0643; 1433-3058
DOI: 10.1007/s00521-025-11337-9