IEEE Access (Jan 2023)

Manipulation of Age Variation Using StyleGAN Inversion and Fine-Tuning

  • Dongsik Yoon,
  • Jineui Kim,
  • Vincent Lorant,
  • Sungku Kang

DOI
https://doi.org/10.1109/ACCESS.2023.3336401
Journal volume & issue
Vol. 11
pp. 131475 – 131486

Abstract

Recent advancements in deep learning have yielded significant developments in age manipulation techniques in the field of computer vision. To handle this task, recent approaches based on generative adversarial network (GAN) latent space transformation or image-to-image translation have been developed. However, such methods are limited in terms of preserving the facial identity of the subject and recovering background details during lifelong age variation. To address these limitations, this paper presents a novel framework to manipulate the age of subjects in photos. The proposed framework involves two main steps, i.e., age manipulation and StyleGAN fine-tuning. In the first step, the iterative ReStyle StyleGAN inversion technique finds the latent vector that most closely reconstructs the input image, and this vector is used to train an age manipulation encoder. In the second step, a StyleGAN fine-tuning process reconstructs the details lost in the images synthesized by the StyleGAN generator during age manipulation. To preserve substructures, e.g., backgrounds, we optimize the loss function using facial masks generated from the original and age-manipulated images. The proposed framework is compatible with various StyleGAN-based techniques, e.g., stylization and view synthesis. Compared with state-of-the-art methods, the proposed framework achieves reasonable manipulation and variation of the target age for real-world input images. The results demonstrate the effectiveness of the proposed method in preserving facial identity and background details during lifelong age variation.
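
To illustrate the mask-based fine-tuning objective described in the abstract, the sketch below assembles a background-preserving loss for the generator fine-tuning step. It is a minimal sketch under assumptions, not the authors' implementation: it assumes a PyTorch StyleGAN generator exposing a synthesis(w) call, a precomputed binary face mask, and an LPIPS perceptual-loss module; the name masked_finetune_loss and the argument names are hypothetical.

    import torch.nn.functional as F

    def masked_finetune_loss(generator, w_pivot, original_img, face_mask, lpips_fn):
        """Sketch of a background-preserving fine-tuning loss (assumed interfaces).

        generator    -- StyleGAN generator with a synthesis(w) method (assumption)
        w_pivot      -- latent vector recovered by inversion for the input image
        original_img -- real input photo, shape (N, 3, H, W)
        face_mask    -- binary face-region mask, shape (N, 1, H, W)
        lpips_fn     -- LPIPS perceptual distance module
        """
        synth = generator.synthesis(w_pivot)      # image re-synthesized from the inverted latent
        bg_mask = 1.0 - face_mask                 # background = everything outside the face mask

        # Pixel-level reconstruction restricted to the background, so fine-tuning
        # restores details (e.g., backgrounds) that the inversion lost there.
        l2_bg = F.mse_loss(synth * bg_mask, original_img * bg_mask)

        # Perceptual term on the whole image keeps the synthesized face plausible
        # while the background is pulled toward the original photo.
        perceptual = lpips_fn(synth, original_img).mean()

        return l2_bg + perceptual

In a typical use, the inverted latent would serve as the pivot while the generator weights are updated with this loss; the face/background split is what lets the background match the original photo without constraining the age-manipulated face region.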

Keywords