SRCNN: Stacked-Residual Convolutional Neural Network for Improving Human Activity Classification Based on Micro-Doppler Signatures of FMCW Radar


Bibliographic Details
Published in: Journal of Electromagnetic Engineering and Science, Vol. 24, No. 4, pp. 358-369
Main Authors: Nguyen, NgocBinh; Doan, Van-Sang; Pham, MinhNghia; Le, VanNhu
Format: Journal Article
Language: English
Published: Korean Institute of Electromagnetic Engineering and Science (JEES), 31.07.2024

Summary: Current methods for daily human activity classification primarily rely on optical images from cameras or on wearable sensors. Despite their high detection reliability, camera-based approaches suffer from several drawbacks, such as poor performance in low-light conditions, limited range, and privacy concerns. To address these limitations, this article proposes the use of a frequency-modulated continuous wave (FMCW) radar sensor for activity recognition. A stacked-residual convolutional neural network (SRCNN) is introduced to classify daily human activities based on the micro-Doppler features of returned radar signals. The model employs a two-layer stacked-residual structure to reuse earlier features, thereby improving classification accuracy. The model is fine-tuned with different hyperparameters to find a trade-off between classification accuracy and inference time. Evaluations are conducted through training and testing on both simulated and measured datasets. As a result, the SRCNN model with six stacked-residual blocks and 64 filters achieves the best performance, with accuracies exceeding 95% and 99% at 0 dB and 10 dB, respectively. Remarkably, the proposed model outperforms several state-of-the-art CNN models in terms of classification accuracy and execution time on the same datasets.
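The abstract describes the architecture only at a high level (six stacked-residual blocks, 64 filters, two conv layers per block). The following is a minimal sketch of such a network, assuming a PyTorch implementation; the kernel sizes, normalization layers, classifier head, and the number of activity classes are assumptions not specified in the abstract, not the authors' actual model.

```python
# Hypothetical sketch of a stacked-residual CNN for micro-Doppler spectrograms (assumed PyTorch).
import torch
import torch.nn as nn


class StackedResidualBlock(nn.Module):
    """Two stacked conv layers with an identity shortcut, so earlier features are reused."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # residual shortcut adds the block input back in


class SRCNNSketch(nn.Module):
    """Stem conv -> six stacked-residual blocks (64 filters) -> global-pool classifier."""

    def __init__(self, num_classes: int = 7, channels: int = 64, num_blocks: int = 6):
        # num_classes=7 is an illustrative assumption; the abstract does not state the class count.
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.blocks = nn.Sequential(
            *[StackedResidualBlock(channels) for _ in range(num_blocks)]
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, num_classes),
        )

    def forward(self, x):
        # x: single-channel micro-Doppler spectrogram batch, shape (N, 1, H, W)
        return self.head(self.blocks(self.stem(x)))


# Example: forward a batch of two dummy 128x128 spectrograms (input size is also assumed).
logits = SRCNNSketch()(torch.randn(2, 1, 128, 128))
```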
ISSN: 2671-7255; 2671-7263
DOI: 10.26866/jees.2024.4.r.235