Grasping Force Estimation by sEMG Signals and Arm Posture: Tensor Decomposition Approach
Published in: Journal of Bionic Engineering, Vol. 16, No. 3, pp. 455–467
Format: Journal Article
Language: English
Published: Springer Singapore, Singapore, 01.05.2019
Summary: Grasping force estimation using surface electromyography (sEMG) has been actively investigated, as it can increase the manipulability and dexterity of prosthetic and robotic hands. Most current studies in this area focus only on the relationship between sEMG signals and grasping force, without considering arm posture; such regression models are therefore unsuitable for predicting grasping force across varying arm postures. In this paper, a method to predict grasping force from sEMG signals under various grasping postures is developed. The proposed algorithm uses tensor algebra to train a multi-factor model that relates sEMG signals to various grasping forces and to postures of the wrist and forearm across multiple dimensions. The multi-factor model is then decomposed into four independent factor spaces: grasping force, sEMG signals, wrist posture, and forearm posture. Moreover, when a participant executes a new posture, new wrist and forearm factors are interpolated within the factor spaces, so the grasping force under various postures can be predicted by combining these factors. The effectiveness of the proposed method is verified through experiments with ten healthy subjects, demonstrating that the proposed grasping force prediction method outperforms the previous algorithm.
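The abstract's multi-factor model — a data tensor indexed by grasping force, sEMG signal, wrist posture, and forearm posture, decomposed into one factor space per mode — can be illustrated with a higher-order SVD (HOSVD, a standard Tucker-style decomposition). The sketch below is a hypothetical illustration, not the paper's implementation: the tensor dimensions, the use of plain HOSVD, and the `hosvd`/`reconstruct` helper names are all assumptions for demonstration.

```python
import numpy as np

def hosvd(T):
    """Higher-order SVD: one orthonormal factor matrix per mode, plus a core tensor.
    Illustrative stand-in for the paper's four-factor decomposition."""
    factors = []
    for mode in range(T.ndim):
        # Unfold T along `mode` and take the left singular vectors as that
        # mode's factor space (e.g. force, sEMG, wrist, forearm).
        unfolded = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U)
    # Core tensor: project T onto each mode's factor space (multiply by U^T).
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(core, U, axes=([mode], [0])), -1, mode)
    return core, factors

def reconstruct(core, factors):
    """Recombine the core with the per-mode factors (core x_m U_m for each mode).
    Swapping in interpolated wrist/forearm factors here would mimic the
    paper's prediction step for a new posture."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(T, U.T, axes=([mode], [0])), -1, mode)
    return T

rng = np.random.default_rng(0)
# Hypothetical 4-way data: force levels x sEMG features x wrist x forearm postures.
T = rng.standard_normal((5, 8, 3, 3))
core, factors = hosvd(T)
T_hat = reconstruct(core, factors)
print(np.allclose(T, T_hat))  # full-rank HOSVD reconstructs the tensor exactly
```

Truncating each factor matrix to fewer columns would give the usual low-rank compression; the key point mirrored from the abstract is that the four modes are decomposed independently, so a single mode's factor can be replaced (or interpolated) without retraining the others.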
ISSN: 1672-6529, 2543-2141
DOI: 10.1007/s42235-019-0037-0