IEEE Access (Jan 2024)

Multimodal Abnormal Event Detection in Public Transportation

  • Dimitris Tsiktsiris,
  • Antonios Lalas,
  • Minas Dasygenis,
  • Konstantinos Votis

DOI
https://doi.org/10.1109/ACCESS.2024.3425308
Journal volume & issue
Vol. 12
pp. 133469 – 133480

Abstract

This work addresses growing concerns about security and passenger safety on public transportation. With the increasing demand for public transport and the rise in road traffic injuries, there is a need for advanced safety measures. Our paper proposes a multimodal abnormal event detection framework that uses deep learning models across several modalities (RGB, depth, and audio) to identify abnormal events and petty crimes, including passenger aggression, petty theft, and vandalism. The proposed detection system is designed for use inside autonomous vehicles and thus aims to function in the absence of a bus driver; its main goal is to enhance the safety and security of passengers during transportation. The methodology involves a deep learning architecture that operates at different frame rates and employs multimodal feature extraction and fusion. The experiments were conducted on a custom multimodal dataset covering the classes bagsnatch, falldown, fighting, normal, and vandalism, and the framework achieved strong detection results compared to existing action recognition models, with a total accuracy of 85.1%. Finally, the study concludes that the proposed system can be installed in autonomous vehicles and significantly improve safety measures on public transportation.
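
To make the "multimodal feature extraction and fusion" step concrete, the sketch below shows a generic late-fusion classifier in PyTorch over pre-extracted RGB, depth, and audio feature vectors. This is an illustration only, not the authors' architecture: the module names, feature dimensions, and concatenation-based fusion are all assumptions for the example.

    # Illustrative sketch only (not the paper's implementation): a late-fusion
    # classifier over per-modality features, assuming RGB/depth clips and an
    # audio segment have already been encoded into fixed-size vectors.
    import torch
    import torch.nn as nn

    CLASSES = ["bagsnatch", "falldown", "fighting", "normal", "vandalism"]

    class LateFusionClassifier(nn.Module):
        def __init__(self, rgb_dim=512, depth_dim=512, audio_dim=128, hidden=256):
            super().__init__()
            # One small projection head per modality; dimensions are assumptions.
            self.rgb_head = nn.Sequential(nn.Linear(rgb_dim, hidden), nn.ReLU())
            self.depth_head = nn.Sequential(nn.Linear(depth_dim, hidden), nn.ReLU())
            self.audio_head = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
            # Fusion by concatenation, followed by a linear classifier.
            self.classifier = nn.Linear(3 * hidden, len(CLASSES))

        def forward(self, rgb_feat, depth_feat, audio_feat):
            fused = torch.cat(
                [self.rgb_head(rgb_feat),
                 self.depth_head(depth_feat),
                 self.audio_head(audio_feat)],
                dim=-1,
            )
            return self.classifier(fused)  # unnormalized class scores (logits)

    # Example forward pass with random features for a batch of 4 clips.
    model = LateFusionClassifier()
    logits = model(torch.randn(4, 512), torch.randn(4, 512), torch.randn(4, 128))
    print(logits.shape)  # torch.Size([4, 5])

Per-modality encoders running at different frame rates (as the abstract describes) would produce the fixed-size feature vectors consumed here; any such encoders and their output sizes are outside the scope of this sketch.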

Keywords