Novel Muscle Sensing by Radiomyography (RMG) and Its Application to Hand Gesture Recognition

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 23, No. 17, p. 1
Main Authors: Zhang, Zijing; Kan, Edwin C.
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2023
Summary: Conventional electromyography (EMG) measures continuous neural activity during muscle contraction but does not explicitly quantify the contraction itself. Mechanomyography (MMG) and accelerometers only measure body surface motion, while ultrasound, CT scan, and MRI are restricted to in-clinic snapshots. Here we propose radiomyography (RMG), a novel method for continuous muscle actuation sensing that can be wearable or touchless and captures both superficial and deep muscle groups. We verified RMG experimentally with a wearable forearm sensor for hand gesture recognition (HGR). We first converted the sensor outputs to time-frequency spectrograms and then employed the vision transformer (ViT) deep learning network as the classification model, which recognized 23 gestures with an average accuracy of up to 99% across 8 subjects. Through transfer learning, high adaptivity to user differences and sensor variation was achieved, with an average accuracy of up to 97%. We further extended RMG to monitor eye and leg muscles and achieved high accuracy for eye movement and body posture tracking. RMG can be used with synchronous EMG to derive stimulation-actuation waveforms for many potential applications in kinesiology, physiotherapy, rehabilitation, and human-machine interfaces.
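
The recognition pipeline described in the summary (sensor trace to time-frequency spectrogram to ViT classifier, with transfer learning of the classification head) can be sketched in Python as below. This is a minimal illustration only: the sampling rate FS, the spectrogram parameters, the torchvision vit_b_16 backbone, and the rmg_to_spectrogram helper are all illustrative assumptions, not the authors' actual configuration; only the 23-gesture class count comes from the summary.

    import numpy as np
    import torch
    import torch.nn as nn
    from scipy.signal import spectrogram
    from torchvision.models import vit_b_16, ViT_B_16_Weights

    NUM_GESTURES = 23      # from the summary
    FS = 1000              # assumed RMG sampling rate in Hz (not stated here)

    def rmg_to_spectrogram(x):
        """Turn a 1-D RMG trace into a 3x224x224 tensor a ViT can consume."""
        f, t, Sxx = spectrogram(x, fs=FS, nperseg=256, noverlap=192)
        S = np.log1p(Sxx)                               # compress dynamic range
        S = (S - S.min()) / (np.ptp(S) + 1e-9)          # scale to [0, 1]
        img = torch.from_numpy(S).float()[None, None]   # shape 1 x 1 x F x T
        img = nn.functional.interpolate(img, size=(224, 224),
                                        mode="bilinear", align_corners=False)
        return img[0].repeat(3, 1, 1)                   # 3 channels for ViT

    # Transfer learning, as hinted in the summary: keep pretrained ViT
    # weights frozen and retrain only a new 23-way classification head.
    model = vit_b_16(weights=ViT_B_16_Weights.IMAGENET1K_V1)
    for p in model.parameters():
        p.requires_grad = False
    model.heads.head = nn.Linear(model.heads.head.in_features, NUM_GESTURES)

    x = np.random.randn(2 * FS)                 # 2 s dummy RMG trace
    logits = model(rmg_to_spectrogram(x)[None]) # logits shape: (1, 23)

Replicating the single-channel spectrogram across three channels is one common way to reuse ImageNet-pretrained ViT weights on spectrogram inputs; whether the authors did this is not stated in the record.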
ISSN: 1530-437X
EISSN: 1558-1748
DOI: 10.1109/JSEN.2023.3294329