Decoding Electromyographic Signal With Multiple Labels for Hand Gesture Recognition
| Published in | IEEE Signal Processing Letters, Vol. 30, pp. 1-5 |
|---|---|
| Main Authors | Zou, Yongxiang; Cheng, Long; Han, Lijun; Li, Zhengwei; Song, Luping |
| Format | Journal Article |
| Language | English |
| Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2023 |
| Subjects | Aggregates; Decoding; Electromyogram decoding; Electromyography; Feature extraction; Gesture recognition; Graph neural networks; Hand gesture recognition; Hospitals; Labels; Mathematical models; Model accuracy; Multiple labels; Muscles; Rehabilitation |
| Online Access | https://ieeexplore.ieee.org/document/10093127; https://www.proquest.com/docview/2809890509 |
| ISSN | 1070-9908 |
| EISSN | 1558-2361 |
| DOI | 10.1109/LSP.2023.3264417 |
Abstract

Surface electromyography (sEMG) is a significant interaction signal in the fields of human-computer interaction and rehabilitation assessment, as it can be used for hand gesture recognition. This paper proposes a novel MLHG model to improve the robustness of sEMG-based hand gesture recognition. The model utilizes multiple labels to decode the sEMG signals from two different perspectives. In the first view, the sEMG signals are transformed into motion signals using the proposed FES-MSCNN (Feature Extraction of sEMG with Multiple Sub-CNN modules). Furthermore, a discriminator FEM-SAGE (Feature Extraction of Motion with graph SAmple and aggreGatE model) is employed to judge the authenticity of the generated motion data. The deep features of the motion signals are extracted using the FEM-SAGE model. In the second view, the deep features of the sEMG signals are extracted using the FES-MSCNN model. The extracted features of the sEMG signals and the generated motion signals are then fused for hand gesture recognition. To evaluate the performance of the proposed model, a dataset containing sEMG signals and multiple labels from 12 subjects has been collected. The experimental results indicate that the MLHG model achieves an accuracy of 99.26% for within-session hand gesture recognition, 78.47% for cross-time, and 53.52% for cross-subject. These results represent a significant improvement compared to using only the gesture labels, with accuracy improvements of 1.91%, 5.35%, and 5.25% in the within-session, cross-time, and cross-subject cases, respectively.
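The abstract describes a two-view design: a CNN-style encoder (FES-MSCNN) that maps sEMG windows to motion signals and also extracts sEMG features, a GraphSAGE-based module (FEM-SAGE) that both discriminates and encodes the motion signals, and a fusion of the two feature streams for gesture classification. The sketch below illustrates only the feature-fusion step in plain PyTorch; the layer choices, channel counts, motion dimensionality, and the ten-gesture output are illustrative assumptions, and neither the adversarial generation of motion signals nor the actual graph-based FEM-SAGE network is reproduced here.

```python
# Hypothetical sketch of the two-view feature-fusion idea from the abstract
# (not the authors' implementation): one encoder for raw sEMG windows, one
# encoder for motion signals, features concatenated for gesture classification.
# All shapes, layer sizes, and the gesture count are illustrative assumptions.
import torch
import torch.nn as nn


class SEMGEncoder(nn.Module):
    """Stand-in for FES-MSCNN: a 1-D CNN over multi-channel sEMG windows."""

    def __init__(self, in_channels=8, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):  # x: (batch, channels, time)
        return self.net(x)


class MotionEncoder(nn.Module):
    """Stand-in for FEM-SAGE: here a plain MLP over a flat motion vector."""

    def __init__(self, in_dim=15, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, feat_dim))

    def forward(self, m):  # m: (batch, motion_dim)
        return self.net(m)


class FusionClassifier(nn.Module):
    """Concatenate the two feature views and classify the hand gesture."""

    def __init__(self, feat_dim=64, num_gestures=10):
        super().__init__()
        self.semg_enc = SEMGEncoder(feat_dim=feat_dim)
        self.motion_enc = MotionEncoder(feat_dim=feat_dim)
        self.head = nn.Linear(2 * feat_dim, num_gestures)

    def forward(self, semg, motion):
        fused = torch.cat([self.semg_enc(semg), self.motion_enc(motion)], dim=1)
        return self.head(fused)


# Example forward pass with dummy data: 4 windows of 8-channel sEMG (200
# samples each) plus a 15-dimensional motion vector per window.
model = FusionClassifier()
logits = model(torch.randn(4, 8, 200), torch.randn(4, 15))
print(logits.shape)  # torch.Size([4, 10])
```

In the paper's setting, the motion branch would operate on graph-structured motion data rather than a flat feature vector; the MLP above is only a placeholder for that component.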
Authors

| Author | ORCID | Affiliation |
|---|---|---|
| Zou, Yongxiang | | School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China |
| Cheng, Long | 0000-0001-7565-8788 | School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China |
| Han, Lijun | | School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China |
| Li, Zhengwei | 0000-0002-5148-9200 | State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China |
| Song, Luping | | Shenzhen Sixth People's Hospital (Nanshan Hospital), Huazhong University of Science and Technology Union Shenzhen Hospital, No. 89, Taoyuan Road, Nanshan District, Shenzhen, China |
Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
Funding

- National Key Research & Development Program (Grant 2022YFB4703204)
- CAS Project for Young Scientists in Basic Research (Grant YSBR-034)