Chinese Journal of Network and Information Security (网络与信息安全学报) (Jun 2023)

Gender forgery of faces by fusing wavelet shortcut connection generative adversarial network

  • Wanze CHEN,
  • Liqing HUANG,
  • Jiazhen CHEN,
  • Feng YE,
  • Tianqiang HUANG,
  • Haifeng LUO

Journal volume & issue
Vol. 9
pp. 150–160

Abstract


Mainstream methods for facial attribute manipulation suffer from two defects arising from data and model architecture limitations. First, the bottleneck structure of the autoencoder model causes a loss of feature information, and the traditional practice of continuously injecting target-domain styles into source-domain features during decoding makes the generated image resemble the target domain too closely, losing identity information and fine-grained details. Second, differences in facial attribute composition between images, such as gender, ethnicity, or age, cause variations in frequency-domain information, and current unsupervised training methods do not automatically adjust the proportion of source- and target-domain information at the style-injection stage, resulting in artifacts in the generated images. To address these issues, a facial gender forgery model based on generative adversarial networks and image-to-image translation, namely the fused wavelet shortcut connection generative adversarial network (WscGAN), was proposed. Shortcut connections were added to the autoencoder structure, the outputs of different encoding stages were decomposed at the feature level by wavelet transform, and an attention mechanism processed the resulting components one by one to dynamically adjust the proportion of source-domain features at different frequencies during decoding. This model can forge facial images with respect to the gender attribute. To verify its effectiveness, experiments were conducted on the CelebA-HQ and FFHQ datasets. Compared with the existing optimal models, the method improves the FID and LPIPS indices by 5.4% and 11.2%, and by 1.8% and 6.7%, respectively. Furthermore, qualitative visual comparisons fully demonstrate the effectiveness of the proposed method for gender attribute conversion of facial images.
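The core idea of the abstract — decomposing encoder features into frequency subbands with a wavelet transform, then using an attention-style gate to blend source- and target-domain information per subband — can be illustrated with a minimal NumPy sketch. This is a hedged illustration, not the paper's implementation: the single-level Haar transform and the scalar sigmoid gate in `attention_blend` are simplifying assumptions chosen for clarity.

```python
import numpy as np

def haar_dwt2(x):
    """Single-level 2D Haar wavelet transform of one feature map.
    x: (H, W) array with even H and W.
    Returns the LL (approximation), LH, HL, HH (detail) subbands,
    each of shape (H//2, W//2). With the 1/2 normalization used here
    the transform is orthonormal, so total energy is preserved."""
    a = x[0::2, 0::2]  # top-left of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2.0  # low-frequency approximation
    lh = (a - b + c - d) / 2.0  # horizontal detail
    hl = (a + b - c - d) / 2.0  # vertical detail
    hh = (a - b - c + d) / 2.0  # diagonal detail
    return ll, lh, hl, hh

def attention_blend(src_band, tgt_band):
    """Hypothetical per-subband attention gate (an assumption, not the
    paper's exact mechanism): a scalar weight derived from the relative
    average magnitude of the two subbands mixes source and target
    features, so the source-domain proportion varies by frequency."""
    gate = 1.0 / (1.0 + np.exp(-(np.abs(src_band).mean()
                                 - np.abs(tgt_band).mean())))
    return gate * src_band + (1.0 - gate) * tgt_band

# Decompose a toy 4x4 feature map and blend its LL band with a target's.
src = np.arange(16.0).reshape(4, 4)
tgt = np.ones((4, 4))
src_ll, src_lh, src_hl, src_hh = haar_dwt2(src)
tgt_ll, _, _, _ = haar_dwt2(tgt)
mixed_ll = attention_blend(src_ll, tgt_ll)
```

In the described architecture, each encoding stage's output would pass through such a decomposition along a shortcut connection, and the reweighted subbands would be merged back into the decoder features, letting low- and high-frequency content carry different amounts of source-identity information.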
