Human joint motion data capture and fusion based on wearable sensors
| Published in | *Autonomous Intelligent Systems*, Vol. 5, No. 1, p. 12 |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | Singapore: Springer Nature Singapore, 01.12.2025 |
| Subjects | |
Summary: The field of human motion data capture and fusion has a broad range of potential applications and market opportunities. Capturing human motion data with wearable sensors is less costly and more convenient than other methods, but it suffers from poor capture accuracy and high latency. To overcome the limitations of existing wearable sensors in data capture and fusion, the study first constructed a mathematical model of human joints and bones by combining the quaternion method with root-bone forward kinematics. The sensor data calibration was then optimized, and the Madgwick algorithm was introduced to address the remaining fusion issues. Finally, a novel human joint motion data capture and fusion model was proposed. The experimental results indicated that the maximum mean error and root mean square error of the yaw angle for the new model were 1.21° and 1.17°, respectively, and the maximum mean error and root mean square error of the pitch angle were 1.24° and 1.19°, respectively. The maximum knee joint and elbow joint data capture errors were 3.8 and 6.1, respectively. The proposed approach greatly enhances the precision and dependability of human motion capture, offers a new path for technological advancement in this area, and has a broad range of potential applications.
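The summary describes a joint–bone model built from quaternions and root-bone forward kinematics. Purely as an illustrative sketch, the snippet below chains per-joint orientation quaternions outward from a root bone to recover joint positions; the helper names, the (w, x, y, z) quaternion convention, and the single-chain skeleton are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q: q (x) (0, v) (x) q*."""
    qv = np.array([0.0, *v])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, qv), q_conj)[1:]

def forward_kinematics(root_pos, bone_offsets, joint_quats):
    """Chain per-joint orientations outward from the root bone.

    bone_offsets : bone vectors in each parent segment's local frame
    joint_quats  : unit quaternions per joint, e.g. from sensor fusion
    Returns world-space positions of the root and each joint down the chain.
    """
    positions = [np.asarray(root_pos, dtype=float)]
    q_acc = np.array([1.0, 0.0, 0.0, 0.0])  # accumulated rotation from root
    for offset, q in zip(bone_offsets, joint_quats):
        q_acc = quat_mult(q_acc, q)  # compose parent and joint rotation
        positions.append(positions[-1] + quat_rotate(q_acc, offset))
    return positions

# Example: a two-segment chain (e.g. upper arm and forearm), identity rotations
chain = forward_kinematics(
    root_pos=[0.0, 0.0, 1.5],
    bone_offsets=[[0.0, 0.0, -0.30], [0.0, 0.0, -0.25]],
    joint_quats=[np.array([1.0, 0.0, 0.0, 0.0])] * 2,
)
```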
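The summary names the Madgwick algorithm as the fusion step. Below is a minimal sketch of one update of the standard IMU-only Madgwick filter (gyroscope prediction plus a gradient-descent correction toward measured gravity); the interface, the gain `beta`, and the omission of the magnetometer term are assumptions, and the paper's calibrated variant may differ.

```python
import numpy as np

def madgwick_imu_update(q, gyro, accel, dt, beta=0.1):
    """One step of the IMU-only Madgwick filter.

    q     : orientation quaternion (w, x, y, z), sensor frame -> earth frame
    gyro  : angular rate (gx, gy, gz) in rad/s
    accel : accelerometer reading; only its direction is used
    beta  : gradient gain trading gyro drift against accelerometer noise
    """
    q0, q1, q2, q3 = q
    gx, gy, gz = gyro

    # Quaternion rate from the gyroscope: q_dot = 0.5 * q (x) (0, gyro)
    q_dot = 0.5 * np.array([
        -q1*gx - q2*gy - q3*gz,
         q0*gx + q2*gz - q3*gy,
         q0*gy - q1*gz + q3*gx,
         q0*gz + q1*gy - q2*gx,
    ])

    a_norm = np.linalg.norm(accel)
    if a_norm > 0.0:
        ax, ay, az = np.asarray(accel, dtype=float) / a_norm
        # Objective: mismatch between gravity predicted by q and measured accel
        f = np.array([
            2.0*(q1*q3 - q0*q2) - ax,
            2.0*(q0*q1 + q2*q3) - ay,
            2.0*(0.5 - q1*q1 - q2*q2) - az,
        ])
        # Jacobian of f with respect to q
        J = np.array([
            [-2*q2,  2*q3, -2*q0, 2*q1],
            [ 2*q1,  2*q0,  2*q3, 2*q2],
            [  0.0, -4*q1, -4*q2,  0.0],
        ])
        grad = J.T @ f
        grad_norm = np.linalg.norm(grad)
        if grad_norm > 0.0:
            # Pull the estimate toward the measured gravity direction
            q_dot -= beta * grad / grad_norm

    q = np.asarray(q, dtype=float) + q_dot * dt
    return q / np.linalg.norm(q)
```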
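The reported accuracy figures are mean error and root mean square error of the yaw and pitch angles against a reference. Assuming per-sample residuals between estimated and ground-truth Euler angle traces (the summary does not state whether the mean error is signed or absolute; absolute is used here), the two metrics reduce to:

```python
import numpy as np

def angle_error_metrics(est_deg, ref_deg):
    """Mean absolute error and RMSE between angle traces, in degrees.

    Residuals are wrapped into [-180, 180) so that 359 deg vs 1 deg
    counts as a 2-degree error rather than 358 degrees.
    """
    err = np.asarray(est_deg, dtype=float) - np.asarray(ref_deg, dtype=float)
    err = (err + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    mean_err = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err**2))
    return mean_err, rmse
```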
ISSN: 2730-616X
DOI: 10.1007/s43684-025-00098-w