Vision-Based Finger Tapping Test in Patients With Parkinson's Disease via Spatial-Temporal 3D Hand Pose Estimation


Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, Vol. 26, No. 8, pp. 3848-3859
Main Authors: Guo, Zhilin; Zeng, Weiqi; Yu, Taidong; Xu, Yan; Xiao, Yang; Cao, Xuebing; Cao, Zhiguo
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2022

Summary: The finger tapping test is crucial for diagnosing Parkinson's Disease (PD), but manual visual evaluation can result in score discrepancies due to clinicians' subjectivity. Moreover, wearable sensors require physical contact and may interfere with PD patients' natural movement patterns. Accordingly, a novel computer-vision approach is proposed that uses a depth camera and spatial-temporal 3D hand pose estimation to capture and evaluate PD patients' 3D hand movement. Within this approach, a temporal encoding module extends the A2J deep learning framework to counter the pose-jittering problem, and a pose refinement process alleviates the dependency on massive training data. Additionally, the first vision-based 3D PD hand dataset is constructed, comprising 112 hand samples from 48 PD patients and 11 control subjects, fully annotated by qualified physicians under clinical settings. Tested on this real-world data, the new model achieves 81.2% classification accuracy, surpassing that of individual clinicians in comparison and demonstrating the approach's effectiveness. The demo video can be accessed at https://github.com/ZhilinGuo/ST-A2J .
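As a rough illustration of the downstream evaluation step described in the summary, the sketch below shows how per-frame 3D hand poses could be reduced to the amplitude, frequency, and decrement cues that clinicians rate in the finger tapping test. This is a minimal sketch, not the authors' released code: the function name tapping_features is hypothetical, and the joint indices assume the common 21-joint hand convention (thumb tip = 4, index tip = 8).

```python
import numpy as np
from scipy.signal import find_peaks

def tapping_features(poses, fps=30.0, thumb_tip=4, index_tip=8):
    """poses: (T, 21, 3) array of estimated 3D hand joints, one row per frame."""
    # Thumb-index fingertip distance is the standard finger-tapping signal.
    gap = np.linalg.norm(poses[:, thumb_tip] - poses[:, index_tip], axis=-1)
    # Each local maximum of the gap marks one full open-close tap cycle;
    # the minimum peak spacing filters out jitter-induced spurious peaks.
    peaks, _ = find_peaks(gap, distance=max(1, int(0.15 * fps)))
    if len(peaks) == 0:
        return 0.0, 0.0, 0.0
    amplitude = float(gap[peaks].mean())       # mean opening amplitude
    frequency = len(peaks) * fps / len(gap)    # taps per second
    # Amplitude decrement across the trial (early vs. late taps) is a
    # key bradykinesia cue in clinical finger-tapping scoring.
    decrement = (float(gap[peaks[:3]].mean() - gap[peaks[-3:]].mean())
                 if len(peaks) >= 6 else 0.0)
    return amplitude, frequency, decrement
```

Under these assumptions, the three features extracted from a short depth-camera recording could then feed a standard classifier (e.g., an SVM or small MLP) to predict a clinical severity score; the paper itself learns its scoring from the spatial-temporal pose estimates directly.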
ISSN: 2168-2194
EISSN: 2168-2208
DOI: 10.1109/JBHI.2022.3162386