Mathematics (Nov 2024)

FLARE: A Backdoor Attack to Federated Learning with Refined Evasion

  • Qingya Wang,
  • Yi Wu,
  • Haojun Xuan,
  • Huishu Wu

DOI
https://doi.org/10.3390/math12233751
Journal volume & issue
Vol. 12, no. 23
p. 3751

Abstract

Federated Learning (FL) is vulnerable to backdoor attacks in which attackers inject malicious behaviors into the global model. To counter these attacks, existing works mainly introduce sophisticated defenses that analyze model parameters and employ robust aggregation strategies. However, we find that FL systems can still be attacked by exploiting their inherent complexity. In this paper, we propose a novel three-stage backdoor attack strategy named FLARE (A Backdoor Attack to Federated Learning with Refined Evasion), designed to operate under the radar of conventional defense strategies. FLARE begins with a trigger inspection stage that probes the initial susceptibilities of the FL system, followed by a trigger insertion stage in which the synthesized trigger is stealthily embedded at a low poisoning rate. Finally, the trigger is amplified during the backdoor activation stage to increase the attack’s success rate. Experiments demonstrate that FLARE significantly enhances both the stealthiness and the success rate of backdoor attacks across multiple federated learning environments; in particular, its success rate can be up to 45× higher than that of existing methods.
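The three stages described in the abstract can be illustrated with a toy simulation. The sketch below is our own hypothetical reconstruction, not the paper's implementation: all names (`local_update`, `flare_update`, the stage labels, `poison_rate`, `boost`) are ours, and a linear least-squares model stands in for the neural networks the paper presumably uses.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1):
    # One gradient step of least-squares training on a linear model.
    grad = data.T @ (data @ weights - labels) / len(labels)
    return weights - lr * grad

def flare_update(weights, data, labels, trigger, stage,
                 poison_rate=0.05, boost=5.0):
    """Hypothetical sketch of the three FLARE stages:
    'inspection'  - behave benignly while the attacker probes the system,
    'insertion'   - embed the trigger at a low poisoning rate,
    'activation'  - amplify the malicious model delta to boost success."""
    if stage == "inspection":
        return local_update(weights, data, labels)
    poisoned, target = data.copy(), labels.copy()
    n_poison = max(1, int(poison_rate * len(labels)))
    poisoned[:n_poison] += trigger   # stamp trigger pattern onto a few samples
    target[:n_poison] = 1.0          # attacker-chosen target label
    new_w = local_update(weights, poisoned, target)
    if stage == "activation":
        new_w = weights + boost * (new_w - weights)  # scale up the update
    return new_w

# Toy federated round: the server averages one benign and one malicious client.
rng = np.random.default_rng(0)
w = np.zeros(4)
X, y = rng.normal(size=(40, 4)), rng.normal(size=40)
trigger = np.array([0.0, 0.0, 0.0, 3.0])
for stage in ["inspection", "insertion", "activation"]:
    benign = local_update(w, X, y)
    malicious = flare_update(w, X, y, trigger, stage)
    w = (benign + malicious) / 2.0
```

Because only a few samples are poisoned and amplification is deferred to the final stage, the per-round malicious update stays close to benign updates for most of training, which is the evasion intuition the abstract describes.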
