Symmetry (Sep 2024)

Reflective Adversarial Attacks against Pedestrian Detection Systems for Vehicles at Night

  • Yuanwan Chen,
  • Yalun Wu,
  • Xiaoshu Cui,
  • Qiong Li,
  • Jiqiang Liu,
  • Wenjia Niu

DOI
https://doi.org/10.3390/sym16101262
Journal volume & issue
Vol. 16, no. 10
p. 1262

Abstract


Advances in deep learning have significantly enhanced the accuracy and robustness of pedestrian detection. However, recent studies reveal that adversarial attacks can exploit the vulnerabilities of deep learning models to mislead detection systems. These attacks are effective not only in digital environments but also in the physical world, where they pose significant threats to the reliability of pedestrian detection systems. Existing adversarial attacks targeting pedestrian detection primarily focus on daytime scenarios and are easily noticed by road observers. In this paper, we propose a novel adversarial attack method against vehicle–pedestrian detection systems at night. Our approach utilizes reflective optical materials that effectively reflect light back to its source. We optimize the placement of these reflective patches using the particle swarm optimization (PSO) algorithm and deploy patches that blend with the color of pedestrian clothing in real-world scenarios. These patches remain inconspicuous during the day or under low-light conditions, but at night, the light they reflect from vehicle headlights effectively disrupts the vehicle's pedestrian detection system. Considering that real-world detection models are often black-box systems, we propose a "symmetry" strategy, which uses the behavior of a surrogate model to simulate the response of the target model to adversarial patches. We generate adversarial examples using YOLOv5 and apply our attack to various types of pedestrian detection models. Experiments demonstrate that our approach is both effective and broadly applicable.
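The patch-placement step described in the abstract can be sketched in code. The following is a minimal, self-contained PSO sketch, assuming the setting the paper describes: each particle encodes the normalized (x, y) positions of k reflective patches, and the objective to minimize is the surrogate detector's confidence. The `detector_confidence` function here is a hypothetical toy stand-in; in the actual method it would render the patches onto the pedestrian image and query the surrogate model (e.g., YOLOv5).

```python
import random

def detector_confidence(positions):
    # Toy stand-in objective (assumption, not the paper's real loss):
    # pretend confidence is lowest when patches sit near the center
    # (0.5, 0.5) of the normalized body region.
    return sum((x - 0.5) ** 2 + (y - 0.5) ** 2 for x, y in positions)

def pso(num_patches=3, swarm_size=20, iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    dim = 2 * num_patches  # flat vector of (x, y) pairs in [0, 1]

    def unflatten(v):
        return [(v[2 * i], v[2 * i + 1]) for i in range(num_patches)]

    # Initialize particle positions and velocities.
    pos = [[rng.random() for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]
    pbest_val = [detector_confidence(unflatten(p)) for p in pos]
    g = min(range(swarm_size), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard PSO velocity update: inertia + cognitive + social.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp positions to the normalized body region [0, 1].
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            val = detector_confidence(unflatten(pos[i]))
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return unflatten(gbest), gbest_val

best_positions, best_score = pso()
print(best_positions, best_score)
```

Because PSO only needs objective values, not gradients, the same loop works whether the scored model is the surrogate or, in a transfer setting, any black-box target.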

Keywords