Training-Free Bayesian Self-Adaptive Classification for sEMG Pattern Recognition Including Motion Transition

Bibliographic Details
Published in: IEEE Transactions on Biomedical Engineering, Vol. 67, No. 6, pp. 1775-1786
Main Authors: Park, Seongsik; Chung, Wan Kyun; Kim, Keehoon
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2020

Summary: A direct, ready-to-use surface electromyogram (sEMG) pattern classification algorithm that does not require prerequisite training, regardless of the user, is proposed herein. In addition to data collection, conventional supervised learning approaches for sEMG require labeling and segmenting the data and additional time for the learning algorithm. Consequently, these approaches cannot cope well with sEMG patterns during motion transitions at various movement speeds. The proposed unsupervised and self-adaptive method employs an iterative self-adaptive procedure, realized by the probabilistic steps of diffusion, updating, and registration, to simultaneously cluster the activation patterns in real time and classify the current sEMG against the newly clustered patterns. Experiments demonstrated that even for the same motion, the proposed method could autonomously detect changes in muscular activation patterns varying with the speed of motion. Furthermore, some patterns of both steady- and transient-state motions could be distinguished. In addition, it was verified that the classified sEMG pattern could be correlated consistently with the actual motion, thereby realizing a high level of motion classification.
ISSN: 0018-9294, 1558-2531
DOI: 10.1109/TBME.2019.2947089
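
The summary above describes the approach only at a high level. The Python sketch below is a minimal illustration of a training-free, self-adaptive clustering loop in that spirit: cluster uncertainty grows over time (diffusion), an incoming sEMG feature vector updates the cluster that best explains it (updating), and a new cluster is created when no existing one explains it well (registration). The Gaussian cluster model, the class name SelfAdaptiveClusterer, and all thresholds and rates are assumptions made for illustration; they are not taken from the paper itself.

# Illustrative sketch only; not the authors' formulation.
import numpy as np


class SelfAdaptiveClusterer:
    """Online, training-free clustering of sEMG feature vectors (illustrative)."""

    def __init__(self, dim, diffusion=1e-3, new_cluster_threshold=-10.0):
        self.dim = dim
        self.diffusion = diffusion              # covariance inflation per step ("diffusion")
        self.threshold = new_cluster_threshold  # log-likelihood below which a new pattern is registered
        self.means, self.covs, self.counts = [], [], []

    def _log_likelihood(self, x, mean, cov):
        # Gaussian log-likelihood of one feature vector under one cluster.
        diff = x - mean
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (diff @ np.linalg.inv(cov) @ diff + logdet + self.dim * np.log(2 * np.pi))

    def step(self, x):
        """Classify one sEMG feature vector and adapt the clusters in place."""
        # "Diffusion": let every cluster's uncertainty grow so it can track drift.
        for k in range(len(self.covs)):
            self.covs[k] = self.covs[k] + self.diffusion * np.eye(self.dim)

        if not self.means:
            self._register(x)
            return 0

        scores = [self._log_likelihood(x, m, c) for m, c in zip(self.means, self.covs)]
        best = int(np.argmax(scores))

        if scores[best] < self.threshold:
            # "Registration": no existing pattern explains the sample well; add a new one.
            self._register(x)
            return len(self.means) - 1

        # "Updating": running mean/covariance update of the matched cluster.
        self.counts[best] += 1
        lr = 1.0 / self.counts[best]
        diff = x - self.means[best]
        self.means[best] = self.means[best] + lr * diff
        self.covs[best] = (1 - lr) * self.covs[best] + lr * np.outer(diff, diff)
        return best

    def _register(self, x):
        self.means.append(np.asarray(x, dtype=float).copy())
        self.covs.append(np.eye(self.dim))
        self.counts.append(1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clusterer = SelfAdaptiveClusterer(dim=4)
    # Simulated stream: two synthetic "activation patterns" standing in for sEMG features.
    stream = np.vstack([rng.normal(0.0, 0.3, (50, 4)), rng.normal(3.0, 0.3, (50, 4))])
    labels = [clusterer.step(x) for x in stream]
    print("clusters discovered:", len(clusterer.means))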