IEEE Access (Jan 2024)

A BiLSTM-Based Feature Fusion With CNN Model: Integrating Smartphone Sensor Data for Pedestrian Activity Recognition

  • Rana Sabah,
  • Meng Chun Lam,
  • Faizan Qamar,
  • B. B. Zaidan

DOI
https://doi.org/10.1109/ACCESS.2024.3468470
Journal volume & issue
Vol. 12
pp. 142957 – 142978

Abstract


Given the wide range of sensor applications, pedestrian activity recognition using smartphone sensors has attracted significant research attention. Recognizing activities can yield valuable insights into a person’s actions and the context in which they occur. This study proposes a bidirectional long short-term memory-based feature fusion model with a convolutional neural network (BiLSTM-BFF with CNN) that integrates hand-crafted time- and frequency-domain features with CNN-extracted features. The fused feature vector serves as input to the BiLSTM network, and the model recognizes 14 types of pedestrian activity. A new pedestrian activity dataset was collected from smartphone sensors carried by different types of people (men, women, children, pregnant women, and people with a limp) performing various activities (walking, fast walking, elevator up and down, step escalator up and down, walking on a step escalator up and down, flat escalator up and down, walking on a flat escalator up and down, and walking upstairs and downstairs). The efficiency of the proposed BiLSTM-BFF with CNN model was validated through experiments on this new dataset, where it achieved 95.35% accuracy in recognizing pedestrian activities, outperforming the other methods compared.
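The abstract describes fusing time- and frequency-domain features with CNN-extracted features before the BiLSTM stage. The paper's exact feature set is not given here, so the sketch below is only an illustration of the hand-crafted side of such a fusion pipeline: it computes a few common time-domain statistics and FFT-based frequency-domain descriptors per axis from one tri-axial accelerometer window. The window length (128 samples) and the specific statistics are assumptions, not the authors' configuration.

```python
import numpy as np

def timefreq_features(window):
    """Hand-crafted features for one sensor window of shape (samples, axes).

    Assumed feature set (illustrative, not the paper's): per-axis
    time-domain mean/std/min/max plus frequency-domain spectral
    energy and dominant-frequency bin.
    """
    # Time domain: 4 statistics per axis
    t = np.concatenate([window.mean(0), window.std(0),
                        window.min(0), window.max(0)])
    # Frequency domain: magnitude spectrum via real FFT along time
    spec = np.abs(np.fft.rfft(window, axis=0))
    energy = (spec ** 2).mean(0)          # mean spectral energy per axis
    dom = spec[1:].argmax(0) + 1          # dominant bin, skipping DC
    return np.concatenate([t, energy, dom])

# Example: a 128-sample tri-axial accelerometer window
rng = np.random.default_rng(0)
win = rng.normal(size=(128, 3))
feat = timefreq_features(win)
print(feat.shape)  # 3 axes x (4 time + 2 freq descriptors) -> (18,)
```

In a fusion model of the kind the abstract outlines, a vector like `feat` would be concatenated with the CNN's learned representation of the raw window, and the combined vector passed to the BiLSTM classifier over the 14 activity classes.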

Keywords