Fall detection method based on spatio-temporal feature fusion using combined two-channel classification
Published in | Multimedia Tools and Applications, Vol. 81, No. 18, pp. 26081–26100
---|---
Format | Journal Article
Language | English
Published | New York: Springer US, 01.07.2022 (Springer Nature B.V.)
Summary: Nowadays, the growing population of senior citizens is a challenge for almost all developing countries. New technologies can help monitor elderly people at home by providing an innovative and secure environment, further enhancing their quality of life. Vision-based systems offer promising results in analyzing human posture and detecting abnormal events such as falls, which pose the greatest risk for seniors living alone. In this article, a new fall detection method is proposed based on a fusion of motion-based and human shape-based features. Motion History Images (MHI) represent the temporal features in this approach, while the height-to-width ratio and centroid of the moving person represent the spatial features. A two-channel classification model is designed using a threshold-based and a keyframe-based approach; whenever the two channels disagree, additional information is used to decide between a fall and a daily activity. Keyframes are selected when the displacement of the spatial features exceeds a preset threshold and are then classified with a K-NN classifier. The proposed algorithm delivers promising results on the simulated fall and daily-activity sequences of the UR Fall Detection dataset. It performs competitively against existing state-of-the-art methods, with a peak accuracy of 98.6% and recall of 100% in detecting falls; specificity and precision are above 96%.
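As a rough illustration of the features described in the abstract, the sketch below computes a Motion History Image by simple frame differencing and extracts the height-to-width ratio and centroid of the moving person from a foreground mask. This is a minimal sketch in Python/OpenCV (assuming OpenCV 4); the thresholds, the frame-differencing segmentation, and the function names are illustrative assumptions, not the authors' exact pipeline.

```python
import cv2
import numpy as np

MHI_DURATION = 20          # assumed temporal window (frames) for the MHI decay
MOTION_THRESHOLD = 30      # assumed frame-difference threshold for the silhouette

def update_mhi(mhi, prev_gray, gray):
    """Decay the Motion History Image and stamp pixels that moved this frame."""
    diff = cv2.absdiff(gray, prev_gray)
    _, silhouette = cv2.threshold(diff, MOTION_THRESHOLD, 1, cv2.THRESH_BINARY)
    mhi = np.maximum(mhi - 1, 0)          # older motion fades linearly
    mhi[silhouette > 0] = MHI_DURATION    # newest motion gets the maximum value
    return mhi

def shape_features(foreground_mask):
    """Height-to-width ratio and centroid of the largest moving region."""
    contours, _ = cv2.findContours(foreground_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    person = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(person)
    m = cv2.moments(person)
    if m["m00"] == 0:
        return None
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return h / w, centroid
```

In this reading, the MHI feeds the temporal channel while the per-frame (ratio, centroid) pairs feed the two spatial-feature channels described next.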
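The decision logic could likewise be sketched as follows, assuming per-frame features are (ratio, (cx, cy)) pairs and a K-NN model already fitted on labeled keyframe vectors (ratio, centroid x, centroid y) with 0 = daily activity and 1 = fall. The displacement threshold, the ratio threshold used as the first channel, and the extra cue consulted when the two channels disagree are all stand-in assumptions for the additional information mentioned in the abstract.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

DISPLACEMENT_THRESHOLD = 15.0   # assumed minimum centroid displacement (pixels) for a keyframe
RATIO_FALL_THRESHOLD = 1.0      # assumed height-to-width ratio below which the pose looks horizontal

def select_keyframes(features):
    """Keep frames whose centroid moved more than the preset threshold."""
    keyframes, prev_centroid = [], None
    for ratio, centroid in features:
        if prev_centroid is not None:
            displacement = np.linalg.norm(np.subtract(centroid, prev_centroid))
            if displacement > DISPLACEMENT_THRESHOLD:
                keyframes.append([ratio, centroid[0], centroid[1]])
        prev_centroid = centroid
    return np.array(keyframes)

def two_channel_decision(features, knn: KNeighborsClassifier) -> bool:
    """Fuse the threshold channel with the keyframe K-NN channel."""
    ratios = np.array([ratio for ratio, _ in features])
    threshold_says_fall = bool(ratios.min() < RATIO_FALL_THRESHOLD)   # channel 1
    keyframes = select_keyframes(features)
    if keyframes.size == 0:
        return threshold_says_fall
    # Channel 2: K-NN over keyframes, labels assumed 0 = daily activity, 1 = fall.
    knn_says_fall = bool(knn.predict(keyframes).mean() > 0.5)
    if knn_says_fall == threshold_says_fall:
        return knn_says_fall
    # Channels disagree: consult an extra cue (how far the centroid dropped;
    # image y grows downward) as a placeholder for the additional information.
    centroids = np.array([c for _, c in features])
    return bool(centroids[:, 1].max() - centroids[0, 1] > DISPLACEMENT_THRESHOLD)
```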
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-11914-3