网络与信息安全学报 (Chinese Journal of Network and Information Security), Apr 2024

Survey of optical-based physical domain adversarial attacks and defense

Authors
Jinyin CHEN, Xiaoming ZHAO, Haibin ZHENG, Haifeng GUO

DOI
https://doi.org/10.11959/j.issn.2096-109x.2024026
Journal volume & issue
Vol. 10, no. 2
pp. 1–21

Abstract

Adversarial attacks mislead deep learning models into false predictions by embedding tiny, human-imperceptible perturbations into the original input, posing a serious security threat to computer vision systems built on deep learning. Compared with digital-domain adversarial attacks, physical-domain adversarial attacks introduce perturbations before the input is captured by the acquisition device and converted into a digital image within the vision system, and therefore pose a realistic security threat to deep learning-based computer vision systems. Optical-based physical-domain attack techniques, of which projected-light irradiation is a typical example, are easily overlooked and receive negligible defensive attention because their perturbations closely resemble effects produced by natural environments in the real world. Given their high invisibility and executability, they can pose a significant, even fatal, threat to deployed systems. Based on existing research, optical-based physical-domain adversarial attack techniques against computer vision systems were introduced and discussed, their attack scenarios, tools, goals, and performance were compared and analyzed, and potential future research directions for optical-based physical-domain adversarial attacks were outlined.
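
To make the threat model concrete, below is a minimal, hypothetical sketch (not any method from the surveyed literature) of how a projected-light perturbation can be simulated digitally and optimized against an image classifier. The additive Gaussian light-spot model, the light_spot helper, and all parameter values are illustrative assumptions; a real optical attack would additionally need to model projector-camera light transport and physical constraints.

```python
# Illustrative sketch only: optimize the parameters of a simulated
# projected light spot so that a classifier's prediction changes.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, ResNet18_Weights

def light_spot(h, w, cx, cy, sigma, color, intensity):
    """Render an additive Gaussian light spot, a crude stand-in for a projector beam."""
    ys = torch.arange(h, dtype=torch.float32).view(h, 1)
    xs = torch.arange(w, dtype=torch.float32).view(1, w)
    mask = torch.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return intensity * color.view(3, 1, 1) * mask  # (3, h, w)

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()
preprocess = weights.transforms()

image = torch.rand(3, 224, 224)  # placeholder for a captured scene photo

# Optimizable projection parameters: spot position, spread, color, brightness.
cx = torch.tensor(112.0, requires_grad=True)
cy = torch.tensor(112.0, requires_grad=True)
sigma = torch.tensor(40.0, requires_grad=True)
color = torch.tensor([1.0, 0.2, 0.2], requires_grad=True)
intensity = torch.tensor(0.3, requires_grad=True)
params = [cx, cy, sigma, color, intensity]

with torch.no_grad():
    true_label = model(preprocess(image).unsqueeze(0)).argmax(1)
opt = torch.optim.Adam(params, lr=1.0)

for step in range(100):
    lit = (image + light_spot(224, 224, cx, cy, sigma, color, intensity)).clamp(0, 1)
    logits = model(preprocess(lit).unsqueeze(0))
    # Untargeted attack: push the model away from its original prediction.
    loss = -F.cross_entropy(logits, true_label)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if logits.argmax(1) != true_label:
        break  # prediction flipped by the simulated projection
```

Because the light spot is rendered as a differentiable function of a handful of physical parameters rather than per-pixel noise, the resulting perturbation stays plausible as real-world illumination, which is precisely what makes such attacks hard to notice and to defend against.
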

Keywords