Integrating Pretrained Encoders for Generalized Face Frontalization
- Authors
Wonyoung Choi, Gi Pyo Nam, Junghyun Cho, Ig-Jae Kim, and Hyeong-Seok Ko
- Subjects
Face frontalization, face pose normalization, face recognition, generative modeling
- Abstract
In the field of face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), designed to fully utilize the diverse facial feature information available from pre-trained face recognition networks. To this end, we propose a feature attention transformation (FAT) module that effectively transfers low-level facial features to the face generator. In addition, to reduce dependency on any single pre-trained encoder, we propose a FAT module organization that accommodates the features of all pre-trained face recognition networks employed. We evaluate the proposed method using an “independent critic” as well as a “dependent critic,” which enables objective judgment. Experimental results show that the proposed method significantly improves face frontalization performance and helps overcome the bias associated with each pre-trained face recognition network employed.
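The abstract does not spell out the FAT module's internals. As a rough illustration only, a feature-injection block along these lines might look like the following PyTorch sketch; the class name, channel sizes, squeeze-and-excitation-style attention, and additive fusion are all illustrative assumptions, not the paper's actual design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FATBlock(nn.Module):
    """Hypothetical feature attention transformation block.

    Re-weights low-level features from a frozen pre-trained face
    recognition encoder with channel attention, projects them to the
    generator's channel width, and adds them to the generator features.
    A sketch of the general idea, not PFT-GAN's actual architecture.
    """

    def __init__(self, enc_channels: int, gen_channels: int):
        super().__init__()
        # Squeeze-and-excitation-style channel attention (an assumption).
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(enc_channels, enc_channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(enc_channels // 4, enc_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # 1x1 projection so encoder features match the generator width.
        self.project = nn.Conv2d(enc_channels, gen_channels, kernel_size=1)

    def forward(self, enc_feat: torch.Tensor, gen_feat: torch.Tensor) -> torch.Tensor:
        attended = enc_feat * self.attention(enc_feat)  # emphasize informative channels
        transferred = self.project(attended)            # match generator channel width
        if transferred.shape[-2:] != gen_feat.shape[-2:]:
            # Resize if encoder and generator feature maps differ spatially.
            transferred = F.interpolate(transferred, size=gen_feat.shape[-2:],
                                        mode="bilinear", align_corners=False)
        return gen_feat + transferred                   # inject into the generator path

# One FAT block per pre-trained encoder lets features from several
# recognition networks flow into the same generator stage.
fat_a = FATBlock(enc_channels=256, gen_channels=128)
fat_b = FATBlock(enc_channels=512, gen_channels=128)
gen_feat = torch.randn(1, 128, 32, 32)
gen_feat = fat_a(torch.randn(1, 256, 32, 32), gen_feat)
gen_feat = fat_b(torch.randn(1, 512, 16, 16), gen_feat)
```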
- Published
- 2024