CapsFall: Fall Detection Using Ultra-Wideband Radar and Capsule Network
| Published in | IEEE Access, Vol. 7, pp. 55336-55343 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019 |
Summary: | Radar technology for at-home health care has many advantages, such as safety, reliability, privacy preservation, and contactless sensing. Detecting falls using radar has recently gained attention in smart health care. In this paper, CapsFall, a new method for fall detection using an ultra-wideband radar that leverages recent advances in deep learning, is proposed. To this end, a radar time series is derived from the radar back-scattered matrix, and its time-frequency representation is obtained and used as input to a capsule network for automatic feature learning. In contrast to other existing methods, the proposed CapsFall method relies on multi-level feature learning from radar time-frequency representations. In particular, it uses a capsule network to automate feature learning and enhance model discriminability. Experiments are conducted on radar signals collected from ten subjects performing various activities in a room environment. The performance of CapsFall is evaluated in terms of classification metrics and compared with that of existing methods based on a convolutional neural network, a multi-layer perceptron, a decision tree, and a support vector machine. The results show that CapsFall outperforms the other methods in accuracy, precision, sensitivity, and specificity. |
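The pipeline outlined in the summary (radar time series → time-frequency representation → capsule network) can be sketched in broad strokes. The snippet below is a minimal illustration under assumed parameters, not the authors' implementation: it builds a synthetic stand-in for a radar time series, computes its spectrogram with `scipy.signal.spectrogram` as the kind of time-frequency input the paper describes, and implements the standard capsule-network "squash" nonlinearity; the sampling rate, chirp parameters, and capsule dimensions are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic stand-in for a radar time series derived from the
# back-scattered matrix (parameters are illustrative, not from the paper).
fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)  # 2 s of data
# A chirp-like component mimics a Doppler signature of motion, plus noise.
x = np.cos(2 * np.pi * (50 * t + 40 * t**2)) + 0.1 * np.random.randn(t.size)

# Time-frequency representation of the kind fed to the network.
f, tt, Sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)

def squash(s, axis=-1, eps=1e-8):
    """Standard capsule-network squash nonlinearity:
    v = (|s|^2 / (1 + |s|^2)) * (s / |s|), so that |v| < 1
    and short vectors shrink toward zero."""
    sq_norm = np.sum(s**2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

# Example: squash a batch of four 8-dimensional capsule output vectors.
caps = np.random.randn(4, 8)
v = squash(caps)
```

The squash function is what lets a capsule's output-vector length be read as a class-presence probability: every squashed vector has norm strictly below one, while its direction (the learned pose features) is preserved.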
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2019.2907925 |