A BiLSTM-Based Feature Fusion With CNN Model: Integrating Smartphone Sensor Data for Pedestrian Activity Recognition

Bibliographic Details
Published in: IEEE Access, Vol. 12, pp. 142957-142978
Main Authors: Sabah, Rana; Lam, Meng Chun; Qamar, Faizan; Zaidan, B. B.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024

Summary: Given the wide range of sensor applications, pedestrian activity recognition research using smartphone sensors has gained significant attention. Recognizing activities can yield valuable insights into a person's actions and the context in which they occur. This study proposed a bidirectional long short-term memory-based feature fusion model with a convolutional neural network (BiLSTM-BFF with CNN) that integrates time-domain and frequency-domain features with CNN-learned features. The fused feature vector was used as input to the BiLSTM network. The BiLSTM-BFF with CNN model recognized 14 types of pedestrian activity. A new pedestrian activity dataset was collected from smartphone sensors carried by different types of people (men, women, children, pregnant women, and people with limps) performing different activities (walking, fast walking, elevator up and down, step escalator up and down, walking with step escalator up and down, flat escalator up and down, walking with flat escalator up and down, upstairs, and downstairs). The efficiency of the proposed BiLSTM-BFF with CNN model was validated through experiments on this new dataset. The proposed method achieved 95.35% accuracy in recognizing pedestrian activities, outperforming the other methods compared.
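
For illustration only, below is a minimal sketch of the kind of fusion architecture the summary describes: a CNN branch over raw smartphone sensor windows, fused with hand-crafted time/frequency-domain features, followed by a BiLSTM classifier over the fused sequence. It assumes a tf.keras implementation with an illustrative window length of 128 samples, six raw sensor channels (accelerometer and gyroscope), and a 40-dimensional hand-crafted feature vector; these sizes, layer choices, and names are assumptions, not the exact configuration reported in the article.

# Sketch of a BiLSTM-based feature fusion model with a CNN branch.
# All dimensions and layer sizes below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 14            # pedestrian activity classes reported in the paper
WINDOW, CHANNELS = 128, 6   # assumed sliding-window length and sensor channels
HANDCRAFTED_DIM = 40        # assumed number of time/frequency-domain features

# CNN branch: learns local features from the raw accelerometer/gyroscope window.
raw_in = layers.Input(shape=(WINDOW, CHANNELS), name="raw_window")
x = layers.Conv1D(64, 5, padding="same", activation="relu")(raw_in)
x = layers.MaxPooling1D(2)(x)
x = layers.Conv1D(128, 5, padding="same", activation="relu")(x)
x = layers.MaxPooling1D(2)(x)            # time axis is now WINDOW // 4 steps

# Hand-crafted branch: per-window time/frequency statistics, tiled so they
# can be concatenated with the CNN feature sequence at every time step.
feat_in = layers.Input(shape=(HANDCRAFTED_DIM,), name="handcrafted_features")
f = layers.Dense(64, activation="relu")(feat_in)
f = layers.RepeatVector(WINDOW // 4)(f)

# Fusion + BiLSTM: concatenate the two feature streams, model temporal
# context in both directions, then classify the activity with a softmax head.
fused = layers.Concatenate()([x, f])
h = layers.Bidirectional(layers.LSTM(128))(fused)
h = layers.Dropout(0.5)(h)
out = layers.Dense(NUM_CLASSES, activation="softmax")(h)

model = Model(inputs=[raw_in, feat_in], outputs=out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

In this sketch the BiLSTM models temporal dependencies of the fused feature sequence; the actual preprocessing, hand-crafted feature set, and layer configuration would need to follow the original article.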
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3468470