Advanced Intelligent Systems (Oct 2023)

Drone Detection and Tracking System Based on Fused Acoustical and Optical Approaches

  • Siyi Ding,
  • Xiao Guo,
  • Ti Peng,
  • Xiao Huang,
  • Xiaoping Hong

DOI
https://doi.org/10.1002/aisy.202300251
Journal volume & issue
Vol. 5, no. 10

Abstract


The increasing popularity of small drones has underscored the urgent need for an effective drone-oriented surveillance system that can work day and night. Herein, an acoustic and optical sensor-fusion-based system, termed the multimodal unmanned aerial vehicle 3D trajectory exposure system (MUTES), is presented to detect and track drone targets. MUTES combines multiple sensor modules, including a microphone array, a camera, and a lidar. The 64-channel microphone array provides semispherical surveillance with a high signal-to-noise ratio for sound source estimation, while the long-range lidar and the telephoto camera enable subsequent precise target localization within a narrower but higher-definition field of view. MUTES employs a coarse-to-fine, passive-to-active localization strategy for wide-range (semispherical) detection and high-precision 3D tracking. To further increase fidelity, an environmental denoising model is trained to select valid acoustic features of a drone target, thus overcoming the drawbacks of traditional sound source localization approaches under noise interference. The effectiveness of the proposed sensor-fusion approach is validated through field experiments. To the best of our knowledge, MUTES provides the farthest detection range, the highest 3D position accuracy, strong anti-interference capability, and acceptable cost for countering unverified drone intruders.
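The following is a minimal illustrative sketch (not the authors' code) of the coarse-to-fine, passive-to-active idea summarized above: a coarse direction of arrival (DOA) from the microphone array steers the narrow-field sensors, the telephoto camera refines the bearing, and the active lidar range converts that bearing into a 3D position. All function names, the simple spherical geometry, and the numbers in the usage example are assumptions for illustration only.

import numpy as np


def direction_vector(azimuth_deg: float, elevation_deg: float) -> np.ndarray:
    """Unit vector for a bearing given in degrees (sensor-centered frame)."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([
        np.cos(el) * np.cos(az),   # x: forward
        np.cos(el) * np.sin(az),   # y: left
        np.sin(el),                # z: up
    ])


def coarse_to_fine_position(coarse_az: float, coarse_el: float,
                            fine_az_offset: float, fine_el_offset: float,
                            lidar_range_m: float) -> np.ndarray:
    """
    Combine the coarse acoustic bearing (passive), the fine angular correction
    from the telephoto camera detection (e.g., pixel offset converted to
    degrees), and the lidar range (active) into a 3D position estimate.
    """
    az = coarse_az + fine_az_offset
    el = coarse_el + fine_el_offset
    return lidar_range_m * direction_vector(az, el)


if __name__ == "__main__":
    # Hypothetical values: acoustic DOA of ~30 deg azimuth / 15 deg elevation,
    # camera refines the bearing by a fraction of a degree, lidar reports 250 m.
    p = coarse_to_fine_position(30.0, 15.0, -0.4, 0.2, 250.0)
    print("Estimated drone position (m):", np.round(p, 1))

In the actual system described by the abstract, each of these inputs would come from its own module (acoustic DOA estimation with denoising, camera-based detection, lidar ranging); the sketch only shows how the coarse and fine measurements could be composed geometrically.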

Keywords