Real-Time Rehabilitation Tracking by Integration of Computer Vision and Virtual Hand

Bibliographic Details
Published in: IEEE International Conference on Rehabilitation Robotics, Vol. 2025, pp. 1-6
Main Authors: Ahmadpanah, Parham; Korivand, Soroush
Format: Conference Proceeding; Journal Article
Language: English
Published: United States: IEEE, 01.05.2025
ISSN: 1945-7901
DOI: 10.1109/ICORR66766.2025.11062975

More Information
Summary: Rehabilitation is a critical component of healthcare services that often requires prolonged monitoring to achieve optimal outcomes. One of the major challenges in rehabilitation tracking is the necessity for patients to visit medical centers, which can be costly and time-consuming, especially for individuals in remote areas. Although telehealth solutions such as video conferencing have been introduced, they are predominantly qualitative and do not provide precise rehabilitation tracking. Consequently, there is a need for a robust, accessible, and accurate method for rehabilitation monitoring. To address this need, we introduce a quantified, easily accessible framework for extracting and analyzing human finger joint torques to enable virtual hand interaction. Hand landmarks from MediaPipe are integrated with the MANO hand model in PyBullet. The hand positions, rotations, and joint angles extracted from the landmarks are used as controller inputs to mirror human hand movements in a virtual environment. A PID controller was designed to accurately track joint angle trajectories, with its parameters optimized using genetic algorithms. The torques extracted from the metacarpophalangeal and proximal interphalangeal joints were validated, achieving mean squared errors of $4.346 \times 10^{-4}\,(\mathrm{N \cdot m})^2$ and $1.506 \times 10^{-5}\,(\mathrm{N \cdot m})^2$, respectively. This quantification and accessibility of human joint dynamics information is highly desirable for tracking rehabilitation progress using exoskeleton robotics.
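The abstract describes the landmark-to-simulation step only at a high level. The following is a minimal sketch of that idea, assuming a webcam feed, a hypothetical URDF export of the MANO hand (`mano_hand.urdf`), and an assumed joint-index mapping; it converts MediaPipe index-finger landmarks to a flexion angle and drives a PyBullet position motor with it. This is an illustration of the pipeline the record describes, not the authors' implementation.

```python
import cv2
import numpy as np
import mediapipe as mp
import pybullet as p

# Hypothetical asset: a URDF export of the MANO hand (not provided in the record).
p.connect(p.DIRECT)
hand_id = p.loadURDF("mano_hand.urdf", useFixedBase=True)
INDEX_MCP_JOINT = 0  # assumed joint index within that URDF

hands = mp.solutions.hands.Hands(max_num_hands=1)

def angle_at(a, b, c):
    """Angle at landmark b formed by segments b->a and b->c (radians)."""
    v1 = np.array([a.x - b.x, a.y - b.y, a.z - b.z])
    v2 = np.array([c.x - b.x, c.y - b.y, c.z - b.z])
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return np.arccos(np.clip(cosang, -1.0, 1.0))

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    res = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if res.multi_hand_landmarks:
        lm = res.multi_hand_landmarks[0].landmark
        # Index-finger MCP flexion from wrist (0), MCP (5), and PIP (6) landmarks:
        # a straight finger gives an angle near pi, so flexion = pi - angle.
        theta = np.pi - angle_at(lm[0], lm[5], lm[6])
        p.setJointMotorControl2(hand_id, INDEX_MCP_JOINT,
                                p.POSITION_CONTROL, targetPosition=theta)
    p.stepSimulation()

cap.release()
hands.close()
```

In the paper's full pipeline, hand position and rotation would be mapped alongside these joint angles, and all fingers would be driven rather than a single joint.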
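The abstract also states that the PID gains were tuned with a genetic algorithm but gives no configuration details. Below is a hedged, self-contained illustration of that combination: a scalar PID tracks a smooth joint-angle reference on an assumed damped second-order joint model, and a small GA (truncation selection, arithmetic crossover, Gaussian mutation) searches for gains minimizing tracking MSE. The plant, GA hyperparameters, and fitness function are all assumptions for illustration; the paper evaluates tracking against the PyBullet joint trajectories instead.

```python
import numpy as np

rng = np.random.default_rng(0)
DT, STEPS = 0.01, 500
# Smooth joint-angle reference trajectory (rad), 0 -> 1 -> 0 over the episode.
REF = 0.5 * (1 - np.cos(np.linspace(0, 2 * np.pi, STEPS)))

def track_mse(gains):
    """Simulate PID control of an assumed damped unit-inertia joint; return MSE."""
    kp, ki, kd = gains
    theta = vel = integ = prev_err = 0.0
    err_sq = 0.0
    for r in REF:
        err = r - theta
        integ += err * DT
        deriv = (err - prev_err) / DT
        prev_err = err
        tau = kp * err + ki * integ + kd * deriv      # PID torque command
        vel += (tau - 0.5 * vel) * DT                 # assumed damping coefficient 0.5
        theta += vel * DT
        err_sq += err ** 2
    return err_sq / STEPS

# Tiny genetic algorithm over (Kp, Ki, Kd); hyperparameters chosen for illustration.
pop = rng.uniform(0.0, 20.0, size=(30, 3))
for gen in range(40):
    fitness = np.array([track_mse(g) for g in pop])
    elite = pop[np.argsort(fitness)[:10]]              # keep the 10 best gain sets
    parents = elite[rng.integers(0, 10, size=(30, 2))]
    children = parents.mean(axis=1)                    # arithmetic crossover
    children += rng.normal(0.0, 0.5, children.shape)   # Gaussian mutation
    pop = np.clip(children, 0.0, 50.0)

best = pop[np.argmin([track_mse(g) for g in pop])]
print("tuned (Kp, Ki, Kd):", best, " MSE:", track_mse(best))
```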