IEEE Access (Jan 2023)
Motion Pattern-Based Scene Classification Using Adaptive Synthetic Oversampling and Fully Connected Deep Neural Network
Abstract
Analyzing crowded environments has become an increasingly researched topic in the computer vision community, largely due to its myriad practical applications, including enhanced video surveillance systems and the estimation of crowd density in specific settings. This paper presents a comprehensive approach to advancing the study of crowd dynamics and behavioral analysis, focusing specifically on the classification of movement patterns. We introduce a specialized neural network-based classifier explicitly designed for the accurate categorization of various crowd scenes, filling a gap in the existing literature by offering robust and adaptive classification capabilities. To optimize the performance of our model, we conduct an in-depth analysis of loss functions commonly employed in multi-class classification tasks. Our study encompasses four widely used loss functions: Focal Loss, Huber Loss, Cross-Entropy Loss, and Multi-Margin Loss. Based on these empirical findings, we introduce a Joint Loss function that combines the strengths of Cross-Entropy Loss and Multi-Margin Loss, outperforming the individual losses across key performance metrics such as accuracy, precision, recall, and F1-score. Furthermore, we address the critical challenge of class imbalance among motion patterns in crowd scenes. To this end, we perform a comprehensive comparative study of two leading oversampling techniques: the synthetic minority oversampling technique (SMOTE) and adaptive synthetic sampling (ADASYN). Our results indicate that ADASYN is more effective at enhancing classification performance. This approach not only mitigates the issue of class imbalance but also provides robust empirical validation for our proposed method. Finally, we subject our model to a rigorous evaluation on the Collective Motion Database, facilitating a comprehensive comparison with existing state-of-the-art techniques.
This evaluation confirms the effectiveness of our model and aligns it with established paradigms in the field.
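The Joint Loss described in the abstract can be sketched as a weighted sum of a Cross-Entropy term and a Multi-Margin (hinge) term. The sketch below is illustrative only: the mixing weight `alpha` and the margin value are assumptions, and the paper's exact combination rule may differ.

```python
import math

def cross_entropy(logits, target):
    # negative log-softmax of the target class (numerically stable)
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum_exp - logits[target]

def multi_margin(logits, target, margin=1.0):
    # PyTorch-style MultiMarginLoss (p=1): hinge penalty over non-target
    # classes, averaged over the number of classes
    c = len(logits)
    return sum(max(0.0, margin - logits[target] + logits[j])
               for j in range(c) if j != target) / c

def joint_loss(logits, target, alpha=0.5):
    # alpha is a hypothetical mixing weight, not taken from the paper
    return (alpha * cross_entropy(logits, target)
            + (1.0 - alpha) * multi_margin(logits, target))
```

Intuitively, the cross-entropy term pushes the target class probability toward 1, while the margin term additionally enforces a score gap between the target class and every other class, so minimizing the combination rewards both calibrated probabilities and well-separated decision boundaries.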
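ADASYN, the oversampling technique the abstract reports as superior, can be illustrated with a minimal NumPy sketch: each minority sample is weighted by the fraction of majority-class points among its k nearest neighbours, so harder-to-learn border regions receive more synthetic data. This simplified single-minority-class re-implementation is not the paper's code; in practice one would typically use `imblearn.over_sampling.ADASYN` from the imbalanced-learn library.

```python
import numpy as np

def adasyn(X, y, minority_label, k=5, beta=1.0, seed=0):
    """Simplified ADASYN sketch for one minority class (assumes >= 2 minority samples)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    minority = X[y == minority_label]
    n_min, n_maj = len(minority), int(np.sum(y != minority_label))
    G = int((n_maj - n_min) * beta)          # total synthetics to generate
    if G <= 0:
        return X, y

    # k nearest neighbours of each minority sample within the full data set
    d_all = np.linalg.norm(X[None, :, :] - minority[:, None, :], axis=2)
    nn_all = np.argsort(d_all, axis=1)[:, 1:k + 1]        # skip self
    # density distribution: fraction of majority neighbours, normalized
    r = np.mean(y[nn_all] != minority_label, axis=1)
    r_hat = r / r.sum() if r.sum() > 0 else np.full(n_min, 1.0 / n_min)
    g = np.rint(r_hat * G).astype(int)       # synthetics per minority point

    # neighbours within the minority class, used for interpolation
    d_min = np.linalg.norm(minority[None, :, :] - minority[:, None, :], axis=2)
    nn_min = np.argsort(d_min, axis=1)[:, 1:min(k, n_min - 1) + 1]

    synth = []
    for i, gi in enumerate(g):
        for _ in range(gi):
            j = rng.choice(nn_min[i])        # random minority neighbour
            lam = rng.random()               # interpolation factor in [0, 1)
            synth.append(minority[i] + lam * (minority[j] - minority[i]))
    if synth:
        X = np.vstack([X, synth])
        y = np.concatenate([y, np.full(len(synth), minority_label)])
    return X, y
```

Unlike SMOTE, which generates the same number of synthetics per minority sample, the `r_hat` weighting concentrates new samples where minority points are surrounded by majority neighbours, which is the adaptive behaviour the abstract credits for ADASYN's stronger classification results.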
Keywords