IEEE Access (Jan 2024)

Integrating Pretrained Encoders for Generalized Face Frontalization

  • Wonyoung Choi,
  • Gi Pyo Nam,
  • Junghyun Cho,
  • Ig-Jae Kim,
  • Hyeong-Seok Ko

DOI
https://doi.org/10.1109/ACCESS.2024.3377220
Journal volume & issue
Vol. 12
pp. 43530–43539

Abstract

In the field of face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), which is designed to fully utilize the diverse facial feature information available from pre-trained face recognition networks. To that end, we propose a feature attention transformation (FAT) module that effectively transfers low-level facial features to the facial generator. In addition, to reduce the dependency on any single pre-trained encoder, we propose a new FAT module organization that accommodates the features from all of the pre-trained face recognition networks employed. We evaluate the proposed method using an “independent critic” as well as a “dependent critic”, which enables more objective judgments. Experimental results show that the proposed method significantly improves face frontalization performance and helps overcome the bias associated with each pre-trained face recognition network employed.
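The abstract does not specify the internals of the FAT module. Purely as a rough illustration, and with all names, shapes, and weights assumed rather than taken from the paper, an attention-based fusion of feature vectors from several pre-trained encoders might be sketched as follows:

```python
import math

def feature_attention_transform(encoder_feats, w_att, w_proj):
    """Hypothetical sketch (not the paper's FAT module): softmax attention
    over feature vectors from several pre-trained encoders, followed by a
    linear projection toward an assumed generator feature dimension.

    encoder_feats: list of equal-length feature vectors, one per encoder.
    w_att:         attention vector scoring each encoder's features.
    w_proj:        projection matrix (feature dim x generator dim).
    """
    # Relevance score per encoder: dot product with the attention vector.
    scores = [sum(f * w for f, w in zip(feat, w_att)) for feat in encoder_feats]
    # Softmax over encoders (numerically stabilized).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    att = [e / total for e in exps]
    # Attention-weighted fusion of the encoder features.
    dim = len(encoder_feats[0])
    fused = [sum(a * feat[i] for a, feat in zip(att, encoder_feats))
             for i in range(dim)]
    # Project the fused features to the generator feature dimension.
    return [sum(fused[i] * w_proj[i][j] for i in range(dim))
            for j in range(len(w_proj[0]))]
```

In this sketch, encoders whose features score higher against the attention vector dominate the fused representation, which loosely mirrors the paper's goal of drawing on all pre-trained encoders rather than depending on one.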

Keywords