MuNeRF: Robust Makeup Transfer in Neural Radiance Fields.
- Source :
- IEEE transactions on visualization and computer graphics [IEEE Trans Vis Comput Graph] 2024 Feb 22; Vol. PP. Date of Electronic Publication: 2024 Feb 22.
- Publication Year :
- 2024
- Publisher :
- Ahead of Print
Abstract
- There has been high demand for facial makeup transfer tools in fashion e-commerce and virtual avatar generation. Most existing makeup transfer methods are based on generative adversarial networks. Despite their success in makeup transfer for a single image, they struggle to maintain the consistency of the makeup under different poses and expressions of the same person. In this paper, we propose a robust makeup transfer method that consistently transfers the makeup style of a reference image to facial images in any pose and expression. Our method introduces an implicit 3D representation, neural radiance fields (NeRFs), to ensure geometric and appearance consistency. It consists of two separate stages: a basic NeRF module that reconstructs the geometry from the input facial image sequence, and a makeup module that learns to transfer the reference makeup style consistently. We propose a novel hybrid makeup loss, designed around the characteristics of makeup, to supervise the training of the makeup module. The proposed loss significantly improves the visual quality and faithfulness of the makeup transfer effects. To better align the distribution of the transferred makeup with that of the reference makeup, a patch-based discriminator operating in the pose-independent UV texture space is proposed to provide more accurate control of the synthesized makeup. Extensive experiments and a user study demonstrate the superiority of our network across a variety of makeup styles.
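- Note: the record contains no implementation details, so the following is only a minimal sketch of the patch-based discriminator idea mentioned in the abstract: a PatchGAN-style critic applied to pose-independent UV texture maps. The class name `UVPatchDiscriminator`, the layer sizes, and the assumption that unwrapped textures arrive as (B, 3, H, W) tensors are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class UVPatchDiscriminator(nn.Module):
    """PatchGAN-style discriminator on pose-independent UV texture maps.

    Illustrative sketch only: layer sizes and the upstream face-to-UV
    unwrapping step are assumptions, not the authors' implementation.
    """

    def __init__(self, in_channels: int = 3, base_channels: int = 64):
        super().__init__()
        layers = []
        channels = [in_channels, base_channels, base_channels * 2, base_channels * 4]
        for c_in, c_out in zip(channels[:-1], channels[1:]):
            layers += [
                nn.Conv2d(c_in, c_out, kernel_size=4, stride=2, padding=1),
                nn.InstanceNorm2d(c_out),
                nn.LeakyReLU(0.2, inplace=True),
            ]
        # Final 1-channel map: each spatial cell scores one UV patch as real/fake.
        layers.append(nn.Conv2d(channels[-1], 1, kernel_size=4, stride=1, padding=1))
        self.net = nn.Sequential(*layers)

    def forward(self, uv_texture: torch.Tensor) -> torch.Tensor:
        # uv_texture: (B, 3, H, W) texture unwrapped from a rendered or reference
        # face, so each patch covers the same facial region regardless of head pose.
        return self.net(uv_texture)

if __name__ == "__main__":
    disc = UVPatchDiscriminator()
    fake_uv = torch.randn(2, 3, 256, 256)  # transferred-makeup textures (placeholder data)
    real_uv = torch.randn(2, 3, 256, 256)  # reference-makeup textures (placeholder data)
    print(disc(fake_uv).shape, disc(real_uv).shape)  # per-patch realism score maps
```

- Because the texture space is tied to face geometry rather than camera pose, per-patch scores compare the same facial regions (lips, eyelids, cheeks) across views, which is the consistency property the abstract attributes to the UV-space discriminator.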
Details
- Language :
- English
- ISSN :
- 1941-0506
- Volume :
- PP
- Database :
- MEDLINE
- Journal :
- IEEE transactions on visualization and computer graphics
- Publication Type :
- Academic Journal
- Accession number :
- 38386584
- Full Text :
- https://doi.org/10.1109/TVCG.2024.3368443