Subject-Independent sEMG Pattern Recognition by Using a Muscle Source Activation Model

Bibliographic Details
Published in IEEE Robotics and Automation Letters, Vol. 5, no. 4, pp. 5175-5180
Main Authors Kim, Minjae; Chung, Wan Kyun; Kim, Keehoon
Format Journal Article
Language English
Published Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2020

Summary: The interpretation of surface electromyographic (sEMG) signals facilitates intuitive gesture recognition. However, sEMG signals are highly dependent on measurement conditions. The relationship between sEMG signals and gestures identified for a specific subject cannot be applied to other subjects owing to anatomical differences between subjects; furthermore, an sEMG signal varies even with the electrode placement on the same subject. These limitations reduce the practicality of sEMG applications. This letter proposes a subject-independent gesture recognition method based on a muscle source activation model: a reference source model facilitates parameter transfer from a specific subject (the donor) to any other subject (the donee). The proposed method compensates for the angular difference of the interface placement between subjects, and the donee only needs to perform ulnar deviation for approximately 2 s to calibrate the overall process. Ten subjects participated in the experiment, and the results show that, in the best configuration, the subject-independent classifier achieved a reasonable accuracy of 78.3%, compared with 88.7% for the subject-specific classifier, over four wrist/hand motions.
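
As a rough illustration of the electrode-rotation compensation idea described in the summary, the sketch below estimates a circular channel shift from a short calibration recording and applies it before reusing a donor-trained model. The 8-channel armband, RMS features, and shift-search step are illustrative assumptions only; the letter's actual muscle source activation model and transfer procedure are not reproduced here.

```python
# Hypothetical sketch: compensating for electrode-array rotation when
# transferring a donor-trained sEMG model to a new donee. The 8-channel
# armband, RMS features, and circular-shift search are assumptions for
# illustration, not the paper's muscle source activation model.
import numpy as np


def rms_per_channel(emg: np.ndarray) -> np.ndarray:
    """Root-mean-square activation per electrode; emg is (n_samples, n_channels),
    e.g. ~2 s of the donee's ulnar-deviation calibration recording."""
    return np.sqrt(np.mean(emg ** 2, axis=0))


def estimate_shift(donee_pattern: np.ndarray, donor_pattern: np.ndarray) -> int:
    """Channel shift (0..n_channels-1) that best aligns the donee's calibration
    pattern with the donor's reference pattern (circular cross-correlation)."""
    scores = [np.dot(np.roll(donee_pattern, k), donor_pattern)
              for k in range(donor_pattern.size)]
    return int(np.argmax(scores))


def align_channels(emg: np.ndarray, shift: int) -> np.ndarray:
    """Rotate donee channels so donor-trained classifier parameters can be reused."""
    return np.roll(emg, shift, axis=1)


# Usage with synthetic data: the donee's armband is rotated by 3 channels.
rng = np.random.default_rng(0)
profile = np.arange(1.0, 9.0)                        # donor activation profile
donor_ref = rms_per_channel(rng.standard_normal((2000, 8)) * profile)
donee_cal = rng.standard_normal((2000, 8)) * np.roll(profile, 3)
shift = estimate_shift(rms_per_channel(donee_cal), donor_ref)
aligned = align_channels(donee_cal, shift)           # feed to donor-trained model
```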
ISSN: 2377-3766
DOI: 10.1109/LRA.2020.3006824