1. Perm: A Parametric Representation for Multi-Style 3D Hair Modeling
- Author
He, Chengan, Sun, Xin, Shu, Zhixin, Luan, Fujun, Pirk, Sören, Herrera, Jorge Alejandro Amador, Michels, Dominik L., Wang, Tuanfeng Y., Zhang, Meng, Rushmeier, Holly, and Zhou, Yi
- Subjects
Computer Science - Computer Vision and Pattern Recognition, Computer Science - Graphics
- Abstract
We present Perm, a learned parametric model of human 3D hair designed to facilitate various hair-related applications. Unlike previous work that jointly models the global hair shape and local strand details, we propose to disentangle them using a PCA-based strand representation in the frequency domain, thereby allowing more precise editing and output control. Specifically, we leverage our strand representation to fit and decompose hair geometry textures into low- to high-frequency hair structures. These decomposed textures are later parameterized with different generative models, emulating common stages in the hair modeling process. We conduct extensive experiments to validate the architecture design of Perm, and finally deploy the trained model as a generic prior to solve task-agnostic problems, further showcasing its flexibility and superiority in tasks such as 3D hair parameterization, hairstyle interpolation, single-view hair reconstruction, and hair-conditioned image generation. Our code, data, and supplemental material can be found at our project page: https://cs.yale.edu/homes/che/projects/perm/
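To make the abstract's core idea concrete, below is a minimal, hypothetical sketch of what a PCA-based strand representation in the frequency domain could look like. It is not the authors' implementation: the choice of transform (a real FFT along each strand), the strand sampling (100 points per strand), the number of components, and all function names are illustrative assumptions.

```python
import numpy as np

def strands_to_frequency(strands):
    """Map 3D strand polylines of shape (N, P, 3) to frequency-domain features.
    Hypothetical preprocessing; the paper's exact transform is not specified here."""
    coeffs = np.fft.rfft(strands, axis=1)  # complex coefficients per coordinate axis
    feats = np.concatenate([coeffs.real, coeffs.imag], axis=1)
    return feats.reshape(len(strands), -1)  # flatten to (N, D)

def fit_strand_pca(strands, n_components=64):
    """Fit a PCA basis over frequency-domain strand features, giving each strand
    a compact code (illustrative only, not the released Perm model)."""
    X = strands_to_frequency(strands)       # (N, D)
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD-based PCA: rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:n_components]                # (K, D)
    codes = Xc @ basis.T                     # (N, K) low-dimensional strand codes
    return mean, basis, codes

# Synthetic example: 1000 random-walk "strands" with 100 points each.
strands = np.random.randn(1000, 100, 3).cumsum(axis=1)
mean, basis, codes = fit_strand_pca(strands, n_components=64)
print(codes.shape)  # (1000, 64)
```

Truncating the PCA basis at different numbers of components is one simple way to separate low-frequency global shape from high-frequency strand detail, which is the kind of disentanglement the abstract describes.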
- Published
2024