Across Sessions and Subjects Domain Adaptation for Building Robust Myoelectric Interface
Gesture interaction via surface electromyography (sEMG) signal is a promising approach for advanced human-computer interaction systems. However, improving the performance of the myoelectric interface is challenging due to the domain shift caused by the signal's inherent variability. To enhance...
Published in | IEEE transactions on neural systems and rehabilitation engineering Vol. 32; pp. 2005 - 2015 |
---|---|
Main Authors | Li, Wei; Zhang, Xinran; Shi, Ping; Li, Sujiao; Li, Ping; Yu, Hongliu |
Format | Journal Article |
Language | English |
Published | United States: IEEE / The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024 |
Subjects | |
Online Access | Get full text |
Abstract | Gesture interaction via surface electromyography (sEMG) signals is a promising approach for advanced human-computer interaction systems. However, improving the performance of the myoelectric interface is challenging due to the domain shift caused by the signal's inherent variability. To enhance the interface's robustness, we propose a novel adaptive information fusion neural network (AIFNN) framework, which can effectively reduce the effects of domain shift across multiple usage scenarios. Specifically, domain adversarial training is established to inhibit the shared network's weights from exploiting domain-specific representations, thus allowing for the extraction of domain-invariant features. In particular, a classification loss, a domain divergence loss and a domain discrimination loss are employed, which improve classification performance while reducing distribution mismatches between the two domains. To simulate practical use of the myoelectric interface, experiments were carried out involving three scenarios (intra-session, inter-session and inter-subject). Ten non-disabled subjects were recruited to perform sixteen gestures for ten consecutive days. The experimental results indicated that the performance of AIFNN was better than that of two other state-of-the-art transfer learning approaches, namely fine-tuning (FT) and the domain-adversarial neural network (DANN). This study demonstrates the capability of AIFNN to maintain robustness over time and to generalize across users in practical myoelectric interface implementations. These findings could serve as a foundation for future deployments. |
---|---|
AbstractList | Gesture interaction via surface electromyography (sEMG) signals is a promising approach for advanced human-computer interaction systems. However, improving the performance of the myoelectric interface is challenging due to the domain shift caused by the signal's inherent variability. To enhance the interface's robustness, we propose a novel adaptive information fusion neural network (AIFNN) framework, which can effectively reduce the effects of domain shift across multiple usage scenarios. Specifically, domain adversarial training is established to inhibit the shared network's weights from exploiting domain-specific representations, thus allowing for the extraction of domain-invariant features. In particular, a classification loss, a domain divergence loss and a domain discrimination loss are employed, which improve classification performance while reducing distribution mismatches between the two domains. To simulate practical use of the myoelectric interface, experiments were carried out involving three scenarios (intra-session, inter-session and inter-subject). Ten non-disabled subjects were recruited to perform sixteen gestures for ten consecutive days. The experimental results indicated that the performance of AIFNN was better than that of two other state-of-the-art transfer learning approaches, namely fine-tuning (FT) and the domain-adversarial neural network (DANN). This study demonstrates the capability of AIFNN to maintain robustness over time and to generalize across users in practical myoelectric interface implementations. These findings could serve as a foundation for future deployments. |
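The abstract describes AIFNN's training as domain-adversarial learning combining three terms: a classification loss, a domain divergence loss and a domain discrimination loss. The authors' implementation is not part of this record; the sketch below only illustrates that general recipe, assuming a PyTorch setup, a gradient reversal layer for the adversarial term, a linear-kernel MMD as a stand-in for the divergence loss, and placeholder network sizes, feature dimensions and loss weights.

```python
# Minimal sketch (not the authors' AIFNN code) of domain-adversarial training
# with the three losses named in the abstract. All dimensions and weights are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass, negated (scaled) gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def mmd_loss(f_src, f_tgt):
    """Linear-kernel MMD between source and target feature batches (a simple divergence proxy)."""
    return (f_src.mean(dim=0) - f_tgt.mean(dim=0)).pow(2).sum()


class AdversarialEMGNet(nn.Module):
    def __init__(self, n_features=64, n_classes=16):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(n_features, 128), nn.ReLU(),
                                     nn.Linear(128, 64), nn.ReLU())
        self.classifier = nn.Linear(64, n_classes)   # gesture classes
        self.discriminator = nn.Linear(64, 2)        # source vs. target domain

    def forward(self, x, lambd=1.0):
        f = self.feature(x)
        y = self.classifier(f)
        d = self.discriminator(GradReverse.apply(f, lambd))
        return f, y, d


# One illustrative training step on random data standing in for sEMG feature vectors.
model = AdversarialEMGNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x_src, y_src = torch.randn(32, 64), torch.randint(0, 16, (32,))   # labeled source domain
x_tgt = torch.randn(32, 64)                                       # unlabeled target domain

f_src, logits_src, d_src = model(x_src)
f_tgt, _, d_tgt = model(x_tgt)

cls_loss = F.cross_entropy(logits_src, y_src)                     # classification loss
div_loss = mmd_loss(f_src, f_tgt)                                 # domain divergence loss
d_labels = torch.cat([torch.zeros(32), torch.ones(32)]).long()
disc_loss = F.cross_entropy(torch.cat([d_src, d_tgt]), d_labels)  # domain discrimination loss

loss = cls_loss + 0.1 * div_loss + 0.1 * disc_loss                # weights are placeholders
opt.zero_grad()
loss.backward()
opt.step()
```

In the paper's framework, the relative weighting of the three terms and the exact divergence measure follow the authors' design; here they are fixed arbitrarily for illustration only.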
Author | Li, Wei; Zhang, Xinran; Shi, Ping; Yu, Hongliu; Li, Ping; Li, Sujiao |
Author_xml | – sequence: 1 givenname: Wei orcidid: 0000-0002-4808-4659 surname: Li fullname: Li, Wei organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China – sequence: 2 givenname: Xinran orcidid: 0000-0002-0604-4771 surname: Zhang fullname: Zhang, Xinran organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China – sequence: 3 givenname: Ping orcidid: 0000-0001-7955-5567 surname: Shi fullname: Shi, Ping organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China – sequence: 4 givenname: Sujiao surname: Li fullname: Li, Sujiao organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China – sequence: 5 givenname: Ping orcidid: 0000-0003-0361-4895 surname: Li fullname: Li, Ping organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China – sequence: 6 givenname: Hongliu orcidid: 0000-0001-6886-5498 surname: Yu fullname: Yu, Hongliu email: yhl98@hotmail.com organization: Institute of Rehabilitation Engineering and Technology, University of Shanghai for Science and Technology, Shanghai, China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/38147425$$D View this record in MEDLINE/PubMed |
BookMark | eNp9kUtv1DAUhSNURB_wBxBCkdiwyeD3YzmUtoxUQOoUiZ3l2NeVR5l4sJNF_z2ZyRShLlhdX_s7R9f3nFcnfeqhqt5itMAY6U_339d3VwuCCF1QyiRn6EV1hjlXDSIYnezPlDWMEnRanZeyQQhLweWr6pQqzCQj_Kz6tXQ5lVKvoZSY-lLb3tfrsd2AG0r9JW1t7Oult7vBDtN7HVKuP4-x87F_qO9SO5ah_vaYoJv4HF296gfIwTp4Xb0Mtivw5lgvqp_XV_eXX5vbHzery-Vt45ggQ0MFsi0HwMwTHqxwTAqsmA9YWaEFx0p7EGCF1C3VDONWE-TBBSVbGXSgF9Vq9vXJbswux63NjybZaA4XKT8Ym4foOjBAFOJMU8WQZM4HOzWEI6o9DeBtO3l9nL12Of0eoQxmG4uDrrM9pLEYopGQknCtJ_TDM3STxtxPPzUUCcy4FopP1PsjNbZb8H_He9r_BKgZOKSQIRgX500P2cbOYGT2UZtD1GYftTlGPUnJM-mT-39F72ZRBIB_BFQyqjT9A7apsxk |
CODEN | ITNSB3 |
CitedBy_id | crossref_primary_10_1016_j_ish_2024_12_002 crossref_primary_10_1109_TNSRE_2025_3545818 |
Cites_doi | 10.1109/TMRB.2019.2957061 10.3390/s17030458 10.1088/1741-2552/acae0b 10.1016/j.medengphy.2015.02.005 10.1016/j.bspc.2016.08.017 10.1109/tnsre.2022.3178384 10.3389/fnbot.2016.00009 10.3390/bdcc2030021 10.1016/j.compbiomed.2020.104188 10.1109/TNSRE.2019.2896269 10.1109/JSEN.2021.3068521 10.1109/ACCESS.2019.2891350 10.1016/j.bspc.2020.101981 10.1109/TNSRE.2019.2962189 10.1016/j.bspc.2007.11.005 10.1109/JIOT.2022.3218739 10.3389/fnins.2021.657958 10.1109/TBME.2011.2177662 10.1186/1743-0003-11-22 10.1682/jrrd.2010.09.0177 10.1016/1050-6411(95)00015-1 10.1682/JRRD.2010.08.0149 10.3389/fnins.2017.00379 10.1109/TNSRE.2015.2492619 10.1109/TNSRE.2010.2100828 10.1109/TNSRE.2014.2305111 10.1038/sdata.2014.53 10.1016/j.bspc.2019.101572 10.1109/JIOT.2020.2979328 10.1109/TNSRE.2019.2946625 10.1007/11861898_36 10.1109/TNSRE.2021.3086401 10.1088/1741-2552/ab0e2e 10.3389/fnins.2021.621885 10.1109/JIOT.2021.3067382 10.1007/s00521-019-04553-7 10.1109/JIOT.2018.2856119 10.1109/TNSRE.2021.3073751 10.1109/TBME.2019.2899222 10.1109/ICARCV.2018.8581206 10.1016/j.patcog.2018.03.005 10.1109/ICCV.2015.463 10.1038/srep36571 10.1080/03093640600994581 10.3389/fbioe.2020.00158 10.1371/journal.pone.0203835 10.1109/tnsre.2017.2687520 10.1109/JBHI.2022.3159792 10.1109/ACCESS.2020.3027497 10.1109/SMC.2017.8122854 10.3389/fnbot.2021.699174 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DBID | 97E ESBDL RIA RIE AAYXX CITATION CGR CUY CVF ECM EIF NPM 7QF 7QO 7QQ 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D NAPCQ P64 7X8 DOA |
DOI | 10.1109/TNSRE.2023.3347540 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005-present IEEE Open Access Journals IEEE All-Society Periodicals Package (ASPP) 1998-Present IEEE Electronic Library (IEL) CrossRef Medline MEDLINE MEDLINE (Ovid) MEDLINE MEDLINE PubMed Aluminium Industry Abstracts Biotechnology Research Abstracts Ceramic Abstracts Computer and Information Systems Abstracts Corrosion Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts Materials Business File Mechanical & Transportation Engineering Abstracts Neurosciences Abstracts Solid State and Superconductivity Abstracts METADEX Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database Materials Research Database ProQuest Computer Science Collection Civil Engineering Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional Nursing & Allied Health Premium Biotechnology and BioEngineering Abstracts MEDLINE - Academic DOAJ Directory of Open Access Journals |
DatabaseTitle | CrossRef MEDLINE Medline Complete MEDLINE with Full Text PubMed MEDLINE (Ovid) Materials Research Database Civil Engineering Abstracts Aluminium Industry Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Ceramic Abstracts Neurosciences Abstracts Materials Business File METADEX Biotechnology and BioEngineering Abstracts Computer and Information Systems Abstracts Professional Aerospace Database Nursing & Allied Health Premium Engineered Materials Abstracts Biotechnology Research Abstracts Solid State and Superconductivity Abstracts Engineering Research Database Corrosion Abstracts Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic MEDLINE Materials Research Database |
Database_xml | – sequence: 1 dbid: DOA name: DOAJ Directory of Open Access Journals url: https://www.doaj.org/ sourceTypes: Open Website – sequence: 2 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 3 dbid: EIF name: MEDLINE url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search sourceTypes: Index Database – sequence: 4 dbid: RIE name: IEEE Electronic Library (IEL) url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Occupational Therapy & Rehabilitation |
EISSN | 1558-0210 |
EndPage | 2015 |
ExternalDocumentID | oai_doaj_org_article_e280549384074cdfa54925039d3fedab 38147425 10_1109_TNSRE_2023_3347540 10374389 |
Genre | orig-research Research Support, Non-U.S. Gov't Journal Article |
GrantInformation_xml | – fundername: National Key Research and Development Program of China grantid: 2020YFC2007902 funderid: 10.13039/501100012166 |
GroupedDBID | --- -~X 0R~ 29I 4.4 53G 5GY 5VS 6IK 97E AAFWJ AAJGR AASAJ AAWTH ABAZT ABVLG ACGFO ACGFS ACIWK ACPRK AENEX AETIX AFPKN AFRAH AGSQL AIBXA ALMA_UNASSIGNED_HOLDINGS BEFXN BFFAM BGNUA BKEBE BPEOZ CS3 DU5 EBS EJD ESBDL F5P GROUPED_DOAJ HZ~ H~9 IFIPE IPLJI JAVBF LAI M43 O9- OCL OK1 P2P RIA RIE RNS AAYXX CITATION RIG CGR CUY CVF ECM EIF NPM 7QF 7QO 7QQ 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D NAPCQ P64 7X8 |
ID | FETCH-LOGICAL-c462t-360ab5ee14d25fa6c476184df18a6965189de6ea679b39411b920decf87b7f9f3 |
IEDL.DBID | DOA |
ISSN | 1534-4320 1558-0210 |
IngestDate | Wed Aug 27 01:31:54 EDT 2025 Fri Jul 11 12:43:20 EDT 2025 Fri Jul 25 04:40:50 EDT 2025 Wed Feb 19 01:58:14 EST 2025 Tue Jul 01 00:43:29 EDT 2025 Thu Apr 24 23:09:09 EDT 2025 Wed Aug 27 02:05:10 EDT 2025 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
LinkModel | DirectLink |
MergedId | FETCHMERGED-LOGICAL-c462t-360ab5ee14d25fa6c476184df18a6965189de6ea679b39411b920decf87b7f9f3 |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
ORCID | 0000-0001-7955-5567 0000-0002-0604-4771 0000-0002-4808-4659 0000-0003-0361-4895 0000-0001-6886-5498 |
OpenAccessLink | https://doaj.org/article/e280549384074cdfa54925039d3fedab |
PMID | 38147425 |
PQID | 3061459685 |
PQPubID | 85423 |
PageCount | 11 |
ParticipantIDs | doaj_primary_oai_doaj_org_article_e280549384074cdfa54925039d3fedab crossref_citationtrail_10_1109_TNSRE_2023_3347540 proquest_miscellaneous_2906772599 pubmed_primary_38147425 crossref_primary_10_1109_TNSRE_2023_3347540 proquest_journals_3061459685 ieee_primary_10374389 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 20240000 2024-00-00 20240101 2024-01-01 |
PublicationDateYYYYMMDD | 2024-01-01 |
PublicationDate_xml | – year: 2024 text: 20240000 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States – name: New York |
PublicationTitle | IEEE transactions on neural systems and rehabilitation engineering |
PublicationTitleAbbrev | TNSRE |
PublicationTitleAlternate | IEEE Trans Neural Syst Rehabil Eng |
PublicationYear | 2024 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
References | ref13 ref57 ref12 ref56 ref15 ref14 ref53 ref11 ref55 ref10 Shu (ref51) 2018 ref54 ref17 ref16 ref19 ref18 Long (ref49) ref50 ref46 ref45 ref42 ref41 ref44 ref43 ref8 ref7 ref9 ref4 ref3 Long (ref48) ref6 ref5 ref40 ref35 ref34 ref37 ref36 ref31 ref30 ref33 ref32 ref2 ref1 ref39 ref38 Ganin (ref47) 2015; 17 ref24 Müller (ref52); 32 ref23 ref26 ref25 ref20 ref22 ref21 ref28 ref27 ref29 |
References_xml | – ident: ref3 doi: 10.1109/TMRB.2019.2957061 – ident: ref26 doi: 10.3390/s17030458 – ident: ref23 doi: 10.1088/1741-2552/acae0b – ident: ref54 doi: 10.1016/j.medengphy.2015.02.005 – ident: ref57 doi: 10.1016/j.bspc.2016.08.017 – ident: ref44 doi: 10.1109/tnsre.2022.3178384 – ident: ref6 doi: 10.3389/fnbot.2016.00009 – ident: ref39 doi: 10.3390/bdcc2030021 – ident: ref53 doi: 10.1016/j.compbiomed.2020.104188 – ident: ref9 doi: 10.1109/TNSRE.2019.2896269 – ident: ref46 doi: 10.1109/JSEN.2021.3068521 – start-page: 2208 volume-title: Proc. Int. Conf. Mach. Learn. ident: ref49 article-title: Deep transfer learning with joint adaptation networks – ident: ref45 doi: 10.1109/ACCESS.2019.2891350 – ident: ref19 doi: 10.1016/j.bspc.2020.101981 – ident: ref12 doi: 10.1109/TNSRE.2019.2962189 – ident: ref11 doi: 10.1016/j.bspc.2007.11.005 – ident: ref22 doi: 10.1109/JIOT.2022.3218739 – volume: 32 start-page: 1 volume-title: Proc. Adv. Neural Inf. Process. Syst. ident: ref52 article-title: When does label smoothing help? – ident: ref30 doi: 10.3389/fnins.2021.657958 – ident: ref18 doi: 10.1109/TBME.2011.2177662 – ident: ref56 doi: 10.1186/1743-0003-11-22 – ident: ref4 doi: 10.1682/jrrd.2010.09.0177 – ident: ref20 doi: 10.1016/1050-6411(95)00015-1 – ident: ref36 doi: 10.1682/JRRD.2010.08.0149 – ident: ref24 doi: 10.3389/fnins.2017.00379 – ident: ref15 doi: 10.1109/TNSRE.2015.2492619 – ident: ref35 doi: 10.1109/TNSRE.2010.2100828 – ident: ref5 doi: 10.1109/TNSRE.2014.2305111 – ident: ref25 doi: 10.1038/sdata.2014.53 – ident: ref34 doi: 10.1016/j.bspc.2019.101572 – ident: ref2 doi: 10.1109/JIOT.2020.2979328 – ident: ref10 doi: 10.1109/TNSRE.2019.2946625 – ident: ref41 doi: 10.1007/11861898_36 – year: 2018 ident: ref51 article-title: A DIRT-T approach to unsupervised domain adaptation publication-title: arXiv:1802.08735 – ident: ref31 doi: 10.1109/TNSRE.2021.3086401 – ident: ref38 doi: 10.1088/1741-2552/ab0e2e – ident: ref40 doi: 10.3389/fnins.2021.621885 – ident: ref55 doi: 10.1109/JIOT.2021.3067382 – ident: ref17 doi: 10.1007/s00521-019-04553-7 – volume: 17 start-page: 2030 issue: 1 year: 2015 ident: ref47 article-title: Domain-adversarial training of neural networks publication-title: J. Mach. Learn. Res. – ident: ref1 doi: 10.1109/JIOT.2018.2856119 – ident: ref14 doi: 10.1016/j.bspc.2019.101572 – ident: ref42 doi: 10.1109/TNSRE.2021.3073751 – ident: ref7 doi: 10.1109/TBME.2019.2899222 – ident: ref32 doi: 10.1109/ICARCV.2018.8581206 – ident: ref27 doi: 10.1016/j.patcog.2018.03.005 – start-page: 136 volume-title: Proc. Adv. Neural Inf. Process. Syst. ident: ref48 article-title: Unsupervised domain adaptation with residual transfer networks – ident: ref50 doi: 10.1109/ICCV.2015.463 – ident: ref8 doi: 10.1038/srep36571 – ident: ref21 doi: 10.1080/03093640600994581 – ident: ref29 doi: 10.3389/fbioe.2020.00158 – ident: ref37 doi: 10.1371/journal.pone.0203835 – ident: ref43 doi: 10.1109/tnsre.2017.2687520 – ident: ref16 doi: 10.1109/JBHI.2022.3159792 – ident: ref28 doi: 10.1109/ACCESS.2020.3027497 – ident: ref33 doi: 10.1109/SMC.2017.8122854 – ident: ref13 doi: 10.3389/fnbot.2021.699174 |
SSID | ssj0017657 |
Score | 2.4148877 |
Snippet | Gesture interaction via surface electromyography (sEMG) signal is a promising approach for advanced human-computer interaction systems. However, improving the... |
SourceID | doaj proquest pubmed crossref ieee |
SourceType | Open Website Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 2005 |
SubjectTerms | Adaptation models Adult Algorithms Classification Data integration domain adaptation domain adversarial training Electromyography Electromyography - methods Female Gestures Healthy Volunteers Human-computer interface Humans Indexes Male Muscle, Skeletal - physiology Myoelectric interface Myoelectricity Neural networks Neural Networks, Computer Robustness surface electromyography Thumb Training Transfer learning User-Computer Interface Young Adult |
Title | Across Sessions and Subjects Domain Adaptation for Building Robust Myoelectric Interface |
URI | https://ieeexplore.ieee.org/document/10374389 https://www.ncbi.nlm.nih.gov/pubmed/38147425 https://www.proquest.com/docview/3061459685 https://www.proquest.com/docview/2906772599 https://doaj.org/article/e280549384074cdfa54925039d3fedab |
Volume | 32 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
linkProvider | Directory of Open Access Journals |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Across+Sessions+and+Subjects+Domain+Adaptation+for+Building+Robust+Myoelectric+Interface&rft.jtitle=IEEE+transactions+on+neural+systems+and+rehabilitation+engineering&rft.au=Li%2C+Wei&rft.au=Zhang%2C+Xinran&rft.au=Shi%2C+Ping&rft.au=Li%2C+Sujiao&rft.date=2024&rft.issn=1534-4320&rft.eissn=1558-0210&rft.volume=32&rft.spage=2005&rft.epage=2015&rft_id=info:doi/10.1109%2FTNSRE.2023.3347540&rft.externalDBID=n%2Fa&rft.externalDocID=10_1109_TNSRE_2023_3347540 |