Upper-limb prosthetic control using wearable multichannel mechanomyography

Bibliographic Details
Published in IEEE International Conference on Rehabilitation Robotics, Vol. 2017, pp. 1293-1298
Main Authors Wilson, Samuel; Vaidyanathan, Ravi
Format Conference Proceeding; Journal Article
Language English
Published United States, IEEE, 01.07.2017
ISSN 1945-7901
EISSN 1945-7901
DOI 10.1109/ICORR.2017.8009427


Abstract In this paper we introduce a robust multi-channel wearable sensor system for capturing user intent to control robotic hands. The interface is based on a fusion of inertial measurement and mechanomyography (MMG), which measures the vibrations of muscle fibres during motion. MMG is immune to issues such as sweat, skin impedance, and the need for a reference signal that are common to electromyography (EMG). The main contributions of this work are: 1) the hardware design of a fused inertial and MMG measurement system that can be worn on the arm, 2) a unified algorithm for detection, segmentation, and classification of muscle movements corresponding to hand gestures, and 3) experiments demonstrating real-time control of a commercial prosthetic hand (Bebionic Version 2). Results show recognition of seven gestures with an offline classification accuracy of 83.5% across five healthy subjects and one transradial amputee. Gesture recognition was then tested in real time on subsets of two and five gestures, with average accuracies of 93.3% and 62.2%, respectively. To our knowledge, this is the first applied MMG-based control system for practical prosthetic control.
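The record describes the paper's unified detection, segmentation, and classification algorithm only at this high level. As a rough illustration of what such a pipeline can look like, the sketch below implements energy-based burst detection and segmentation on multichannel MMG followed by a feature-based gesture classifier. Everything here is an assumption for illustration: the 1 kHz sampling rate, the 5-100 Hz band, the RMS threshold, the RMS/waveform-length features, and the LDA classifier are not taken from the paper, and the inertial channels the authors fuse in are omitted (they could enter as additional feature columns).

```python
# Illustrative MMG gesture pipeline: detection -> segmentation -> classification.
# NOT the authors' algorithm; all parameters below are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000  # assumed sampling rate (Hz); the record does not state one

def bandpass(x, lo=5.0, hi=100.0, fs=FS):
    """Band-limit the (samples x channels) signal; most MMG energy
    sits roughly in the 5-100 Hz band."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

def segment(x, fs=FS, win_s=0.1, k=3.0):
    """Detection and segmentation: short-time RMS pooled over channels,
    thresholded at k times its median (resting) level. Returns a list of
    (start, end) sample index pairs, end exclusive."""
    n = max(1, int(win_s * fs))
    power = np.mean(x ** 2, axis=1)
    rms = np.sqrt(np.convolve(power, np.ones(n) / n, mode="same"))
    active = rms > k * np.median(rms)
    padded = np.concatenate(([False], active, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    return list(zip(edges[0::2], edges[1::2]))

def features(seg):
    """Per-channel RMS and waveform length, two standard myographic
    time-domain features, concatenated into one vector."""
    rms = np.sqrt(np.mean(seg ** 2, axis=0))
    wl = np.sum(np.abs(np.diff(seg, axis=0)), axis=0)
    return np.concatenate([rms, wl])

def train(recordings, labels):
    """recordings: list of (samples x channels) arrays, one labelled gesture each."""
    X = np.array([features(bandpass(r)) for r in recordings])
    return LinearDiscriminantAnalysis().fit(X, labels)

def classify_stream(model, x):
    """Segment a continuous recording and label each detected burst."""
    filtered = bandpass(x)
    return [(s, e, model.predict(features(filtered[s:e])[None, :])[0])
            for s, e in segment(filtered)]
```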
Authors – Wilson, Samuel (s.wilson14@imperial.ac.uk), Dept. of Mechanical Engineering, Imperial College London, London, UK
– Vaidyanathan, Ravi (r.vaidyanathan@imperial.ac.uk), Dept. of Mechanical Engineering, Imperial College London, London, UK
BackLink https://www.ncbi.nlm.nih.gov/pubmed/28813999 (view this record in MEDLINE/PubMed)
ContentType Conference Proceeding
Journal Article
DOI 10.1109/ICORR.2017.8009427
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
Medline
MEDLINE
MEDLINE (Ovid)
PubMed
MEDLINE - Academic
DatabaseTitle MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
MEDLINE - Academic
DatabaseTitleList MEDLINE; MEDLINE - Academic
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 3
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Occupational Therapy & Rehabilitation
EISBN 1538622963
9781538622964
EISSN 1945-7901
EndPage 1298
ExternalDocumentID 28813999
8009427
Genre orig-research
Journal Article
ISSN 1945-7901
IsPeerReviewed false
IsScholarly true
Language English
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
PMID 28813999
PQID 1930483291
PQPubID 23479
PageCount 6
PublicationCentury 2000
PublicationDate 2017-Jul
PublicationDateYYYYMMDD 2017-07-01
PublicationDate_xml – month: 07
  year: 2017
  text: 2017-Jul
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
PublicationTitle IEEE International Conference on Rehabilitation Robotics
PublicationTitleAbbrev ICORR
PublicationTitleAlternate IEEE Int Conf Rehabil Robot
PublicationYear 2017
Publisher IEEE
Publisher_xml – name: IEEE
SourceID proquest
pubmed
ieee
SourceType Aggregation Database
Index Database
Publisher
StartPage 1293
SubjectTerms Adult
Amputees - rehabilitation
Arm - physiology
Artificial Limbs
Control systems
Electromyography
Female
Gestures
Humans
Male
Middle Aged
Muscles
Myography - instrumentation
Myography - methods
Prosthetics
Robots
Signal Processing, Computer-Assisted - instrumentation
Skin
Vibrations
Young Adult
Title Upper-limb prosthetic control using wearable multichannel mechanomyography
URI https://ieeexplore.ieee.org/document/8009427
https://www.ncbi.nlm.nih.gov/pubmed/28813999
https://www.proquest.com/docview/1930483291
Volume 2017