Novel approach of time series prediction using SEMG on hand gestures for prosthetic control

Bibliographic Details
Published in: Sadhana (Bangalore), Vol. 50, No. 3
Main Authors: Sharma, Tanu; Sharma, K P
Format: Journal Article
Language: English
Published: New Delhi: Springer India, 11.06.2025
Springer Nature B.V.

Summary: To determine the purpose of a movement and assess the state of a muscle's function, a type of bio-electrical signal, the electromyogram (EMG), can be used. EMG has several uses in motor control and neuromuscular physiology because it carries useful information about muscle activity. Electromyography (EMG) signal processing and classification are crucial for controlling prosthetic arms and other applications. Recently, new methods for converting time series data to graphs and complex networks, with applications in biomedicine (prosthetics), have been introduced. This study, based on recordings of seven hand activities from two independent muscle locations in ten subjects, proposes a time-series surface electromyogram (SEMG) signal pattern classification approach for prosthesis control. For the prediction of the time series data, the linear auto-regressive integrated moving average (ARIMA) model is investigated and evaluated. The performed activities could be clearly distinguished from the best classification results in terms of the R-squared and RMSE parameters. The findings from these data sets demonstrate that the proposed model yields appreciable prediction parameter values and can be used in prosthetic applications.
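As a rough illustration of the workflow the abstract describes (fitting an ARIMA model to a single-channel SEMG time series and scoring the fit with R-squared and RMSE), a minimal sketch is given below. The synthetic signal, the train/test split, and the ARIMA order (p, d, q) are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch: ARIMA prediction of a single-channel SEMG-like time series,
# evaluated with R-squared and RMSE (the metrics named in the abstract).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic stand-in for one SEMG recording window; the paper itself uses
# recordings of seven hand activities from two muscle sites in ten subjects.
rng = np.random.default_rng(0)
t = np.arange(1000)
semg = np.sin(0.05 * t) + 0.3 * rng.standard_normal(1000)

# Hold out the last 20% of samples for out-of-sample prediction.
split = int(0.8 * len(semg))
train, test = semg[:split], semg[split:]

# Assumed ARIMA order; in practice it would be selected (e.g., via AIC/BIC).
model = ARIMA(train, order=(4, 0, 2)).fit()
forecast = model.forecast(steps=len(test))

print("R-squared:", r2_score(test, forecast))
print("RMSE:", np.sqrt(mean_squared_error(test, forecast)))
```

In such a scheme, per-activity model fits or their prediction errors could then feed a classifier that distinguishes the hand gestures; the exact classification step used by the authors is not detailed in this record.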
ISSN: 0256-2499; 0973-7677
DOI: 10.1007/s12046-025-02759-1