Journal of Imaging (Jan 2024)

ControlFace: Feature Disentangling for Controllable Face Swapping

  • Xuehai Zhang,
  • Wenbo Zhou,
  • Kunlin Liu,
  • Hao Tang,
  • Zhenyu Zhang,
  • Weiming Zhang,
  • Nenghai Yu

DOI: https://doi.org/10.3390/jimaging10010021
Journal volume & issue: Vol. 10, no. 1, p. 21

Abstract

Face swapping is an intriguing and intricate task in the field of computer vision. Currently, most mainstream face swapping methods employ face recognition models to extract identity features and inject them into the generation process. Nonetheless, such methods often struggle to transfer identity information effectively, so the generated results fail to achieve high identity similarity to the source face. Furthermore, accurately disentangling identity information would enable controllable face swapping, giving users more choices. In pursuit of this goal, we propose a new face swapping framework (ControlFace) based on the disentanglement of identity information. We disentangle the structure and texture of the source face and encode each separately as a feature embedding. According to the semantic level of each feature representation, we inject the embeddings into the corresponding feature mappers and fuse them in the latent space of StyleGAN. Owing to this disentanglement of structure and texture, we can controllably transfer part of the identity features. Extensive experiments and comparisons with state-of-the-art face swapping methods demonstrate the superiority of our framework in transferring identity information, producing high-quality face images, and enabling controllable face swapping.
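
The abstract describes a pipeline in which structure and texture embeddings of the source face are mapped, according to their semantic level, into groups of StyleGAN latent layers and blended with the target's latent code. The following is a minimal PyTorch-style sketch of that idea; the module names, dimensions, layer split, and additive fusion rule are assumptions made for illustration, not the authors' released implementation.

import torch
import torch.nn as nn

class SimpleEncoder(nn.Module):
    """Toy convolutional encoder mapping a face image to an embedding."""
    def __init__(self, embed_dim: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x):
        return self.net(x)

class FeatureMapper(nn.Module):
    """MLP mapping an embedding to a group of StyleGAN W+ latent layers."""
    def __init__(self, embed_dim: int, num_layers: int, latent_dim: int = 512):
        super().__init__()
        self.num_layers = num_layers
        self.latent_dim = latent_dim
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, latent_dim), nn.LeakyReLU(0.2),
            nn.Linear(latent_dim, num_layers * latent_dim),
        )

    def forward(self, e):
        return self.mlp(e).view(-1, self.num_layers, self.latent_dim)

class ControlFaceSketch(nn.Module):
    """Disentangle source structure/texture and fuse them, by semantic level,
    into a W+ code that would condition a pretrained StyleGAN generator."""
    def __init__(self):
        super().__init__()
        self.structure_enc = SimpleEncoder()  # coarse geometry of the source
        self.texture_enc = SimpleEncoder()    # fine appearance of the source
        # StyleGAN2 at 1024 px has 18 W+ layers; the coarse/fine split below
        # (8 vs. 10) is an illustrative choice, not the paper's setting.
        self.structure_mapper = FeatureMapper(512, num_layers=8)
        self.texture_mapper = FeatureMapper(512, num_layers=10)

    def forward(self, source, target_wplus,
                structure_weight: float = 1.0, texture_weight: float = 1.0):
        s = self.structure_mapper(self.structure_enc(source))
        t = self.texture_mapper(self.texture_enc(source))
        offsets = torch.cat([structure_weight * s, texture_weight * t], dim=1)
        # Controllable swap: scaling each offset transfers only part of the
        # source identity into the target's latent code.
        return target_wplus + offsets

if __name__ == "__main__":
    model = ControlFaceSketch()
    src = torch.randn(1, 3, 256, 256)   # source face image
    w_target = torch.randn(1, 18, 512)  # target face's W+ code (e.g., from a GAN inversion encoder)
    w_swapped = model(src, w_target, structure_weight=1.0, texture_weight=0.5)
    print(w_swapped.shape)  # torch.Size([1, 18, 512])

In this sketch, setting texture_weight below 1.0 keeps more of the target's appearance while still transferring the source's structure, which is the kind of partial, controllable identity transfer the abstract refers to.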

Keywords