Attention-based Residual BiLSTM Networks for Human Activity Recognition

Bibliographic Details
Published in: IEEE Access, Vol. 11, p. 1
Main Authors: Zhang, Junjie; Liu, Yuanhao; Yuan, Hua
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023
Summary: Human activity recognition (HAR) commonly employs wearable sensors, analyzing the time-series data they collect to recognize specific actions. However, existing approaches that fuse convolutional and recurrent neural networks have difficulty differentiating between similar actions. To improve the recognition accuracy of similar actions, we propose integrating a residual structure and layer normalization into a bidirectional long short-term memory network (BiLSTM). This integration strengthens the network's feature extraction, and an attention mechanism is introduced to refine the final feature representation, improving both the accuracy and the stability of activity recognition. To validate the approach, we evaluated it extensively on three public datasets: UCI-HAR, WISDM, and KU-HAR, achieving overall recognition accuracies of 98.37%, 99.01%, and 97.89%, respectively. The experimental results demonstrate that the method effectively improves the recognition accuracy of similar behaviors. A codebase implementing the described framework is available at: https://github.com/lyh0625/1DCNN-ResBLSTM-Attention.
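The two mechanisms the summary highlights, a residual (skip) connection around the BiLSTM and an attention step that pools the per-timestep features into a single vector, can be illustrated with a minimal, framework-free sketch. This is not the authors' implementation (that is in the linked repository); the function names and the simple additive-attention pooling shown here are illustrative assumptions only.

```python
import math


def softmax(scores):
    """Numerically stable softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


def attention_pool(hidden_states, scores):
    """Weight each timestep's hidden vector by its (softmaxed) attention
    score and sum, collapsing a sequence into one feature vector.
    hidden_states: list of T vectors (each of length D); scores: list of T floats."""
    weights = softmax(scores)
    dim = len(hidden_states[0])
    pooled = [0.0] * dim
    for w, h in zip(weights, hidden_states):
        for i in range(dim):
            pooled[i] += w * h[i]
    return pooled


def residual_add(x, fx):
    """Residual connection: output = x + F(x), elementwise, so the
    BiLSTM block only has to learn a correction to its input."""
    return [a + b for a, b in zip(x, fx)]
```

With uniform scores, `attention_pool` reduces to mean pooling; a strongly dominant score makes the output approach that timestep's hidden vector, which is how attention can emphasize the timesteps that discriminate between similar actions.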
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3310269