Remote Sensing (Nov 2024)

A Reparameterization Feature Redundancy Extract Network for Unmanned Aerial Vehicles Detection

  • Shijie Zhang,
  • Xu Yang,
  • Chao Geng,
  • Xinyang Li

DOI
https://doi.org/10.3390/rs16224226
Journal volume & issue
Vol. 16, no. 22
p. 4226

Abstract

In unmanned aerial vehicle (UAV) detection, challenges such as occlusion, complex backgrounds, motion blur, and tight inference-time budgets often lead to false and missed detections. General object detection frameworks struggle to address these challenges adequately: they suffer substantial information loss during network downsampling, fuse features insufficiently, and fail to meet real-time requirements. In this paper, we propose a Real-Time Small Object Detection YOLO (RTSOD-YOLO) model to tackle the various challenges faced in UAV detection. First, we enhance the adaptive nature of the ADown module by incorporating an adaptive spatial attention mechanism that processes the downsampled feature maps, enabling the model to focus more effectively on key regions. Second, to address insufficient feature fusion, we employ combined serial and parallel triple feature encoding (TFE), which fuses scale-sequence features from both shallow features and twice-encoded features, yielding a new small-scale object detection layer. While enhancing the global context awareness of the existing detection layers, this also enriches the small-scale detection layer with detailed information. Third, since rich redundant features, a key characteristic of deep neural networks, often ensure a comprehensive understanding of the input, we propose a more efficient redundant feature generation module that produces more feature maps with fewer parameters. We also introduce reparameterization techniques to compensate for potential feature loss while further improving the model's inference speed. Experimental results demonstrate that the proposed RTSOD-YOLO achieves superior detection performance, with mAP50/mAP50:95 reaching 97.3%/51.7%, an improvement of 3%/3.5% over YOLOv8 and 2.6%/0.1% over YOLOv10, while having the lowest parameter count and FLOPs, making it highly efficient in terms of computational resources.
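
To make the attention-guided downsampling idea in the abstract more concrete, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a hypothetical CBAM-style SpatialAttention gate applied after an ADown-style two-branch downsampling block, with all module names, channel splits, and hyperparameters chosen purely for illustration.

# Minimal, illustrative PyTorch sketch of attention-weighted downsampling.
# This is NOT the RTSOD-YOLO code; module names and details are assumptions.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Hypothetical spatial attention gate: re-weights each location of a
    feature map using pooled channel statistics (CBAM-style)."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = x.mean(dim=1, keepdim=True)    # channel-wise average
        max_map = x.amax(dim=1, keepdim=True)    # channel-wise max
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn                          # emphasize key regions


class AttentiveDown(nn.Module):
    """ADown-style downsampling (stride-2 conv branch + max-pool branch),
    followed by the spatial attention gate described in the abstract."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        half = out_ch // 2
        self.conv_branch = nn.Sequential(
            nn.Conv2d(in_ch // 2, half, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(half),
            nn.SiLU(),
        )
        self.pool_branch = nn.Sequential(
            nn.MaxPool2d(3, stride=2, padding=1),
            nn.Conv2d(in_ch - in_ch // 2, out_ch - half, 1, bias=False),
            nn.BatchNorm2d(out_ch - half),
            nn.SiLU(),
        )
        self.attn = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        c = x.shape[1]
        x1, x2 = torch.split(x, [c // 2, c - c // 2], dim=1)
        y = torch.cat([self.conv_branch(x1), self.pool_branch(x2)], dim=1)
        return self.attn(y)


if __name__ == "__main__":
    feats = torch.randn(1, 64, 80, 80)       # dummy backbone feature map
    down = AttentiveDown(64, 128)
    print(down(feats).shape)                 # torch.Size([1, 128, 40, 40])

In this sketch the attention gate is applied to the already-downsampled output, so spatial re-weighting costs little at the reduced resolution; the actual RTSOD-YOLO module may differ in structure and placement.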

Keywords