Alexandria Engineering Journal (Nov 2022)

Generating De-identification facial images based on the attention models and adversarial examples

  • Jingjing Yang,
  • Weijia Zhang,
  • Jiaxing Liu,
  • Jinzhao Wu,
  • Jie Yang

Journal volume & issue
Vol. 61, no. 11
pp. 8417–8429

Abstract


In response to the risk that facial features can lead to privacy leakage and identity theft, this paper presents a method for generating de-identified facial images based on attention models and adversarial examples. The method has two training stages. In Stage 1, target classification networks are used as an attention module to extract refined features from a facial image; these features are fused to obtain the face feature fusion matrix. In Stage 2, the face feature fusion matrix serves as the initial data distribution of the generator in a generative adversarial network, and a pixel-level constraint loss and a loss that constrains the adversarial-example perturbation are added so that the generated privacy-preserving face image remains usable; attention modules are also added to the discriminator of the generative adversarial network. The method then extracts the feature matrix of the de-identified face image and trains it to mimic the face feature fusion matrix, which improves the transferability of the adversarial perturbation carried by the de-identified face image. Experimental results show that the proposed method is superior or comparable to state-of-the-art methods in terms of image quality and robustness.
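
For concreteness, the sketch below shows one way the two stages described in the abstract could be wired up in PyTorch: a target classifier acts as an attention module that produces the face feature fusion matrix (Stage 1), and Stage 2 combines a pixel-level constraint, a bounded perturbation, and a feature-mimicking term. The class and function names (AttentionFeatureExtractor, stage2_losses), the feature dimension, loss weights, and the perturbation budget eps are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the two-stage idea from the abstract (assumed architecture,
# not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionFeatureExtractor(nn.Module):
    """Stage 1 (assumed): use a target classifier's features as attention
    weights to produce a fused face-feature matrix."""

    def __init__(self, classifier: nn.Module, feat_dim: int = 512):
        super().__init__()
        self.classifier = classifier          # pretrained target classification network
        self.fuse = nn.Linear(feat_dim, feat_dim)

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        feats = self.classifier(face)         # refined features, shape (B, feat_dim) assumed
        attn = torch.softmax(feats, dim=-1)   # channel-wise attention weights
        return self.fuse(attn * feats)        # face feature fusion matrix


def stage2_losses(generated: torch.Tensor,
                  original: torch.Tensor,
                  gen_feats: torch.Tensor,
                  fusion_matrix: torch.Tensor,
                  eps: float = 8.0 / 255.0) -> torch.Tensor:
    """Stage 2 (assumed): keep the de-identified image close to the original
    at the pixel level, bound the perturbation, and train the generated
    image's features to mimic the fusion matrix for better transferability."""
    perturbation = torch.clamp(generated - original, -eps, eps)  # constrain perturbation
    pixel_loss = F.l1_loss(original + perturbation, original)    # pixel-level constraint
    feature_loss = F.mse_loss(gen_feats, fusion_matrix)          # mimic fusion matrix
    return pixel_loss + feature_loss
```

In this reading, the feature-mimicking term is what pushes the de-identified image's features toward the fusion matrix and thereby improves transferability of the perturbation across recognizers; the exact loss weighting and perturbation bound would need to follow the paper.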

Keywords