1. PNeSM: Arbitrary 3D Scene Stylization via Prompt-Based Neural Style Mapping
- Author
Chen, Jiafu, Xing, Wei, Sun, Jiakai, Chu, Tianyi, Huang, Yiling, Ji, Boyan, Zhao, Lei, Lin, Huaizhong, Chen, Haibo, and Wang, Zhizhong
- Abstract
3D scene stylization refers to transforming the appearance of a 3D scene to match a given style image, ensuring that images rendered from different viewpoints exhibit the same style as the given style image while maintaining the 3D consistency of the stylized scene. Several existing methods have achieved impressive results in stylizing 3D scenes. However, the models proposed by these methods need to be re-trained when applied to a new scene. In other words, their models are coupled with a specific scene and cannot adapt to other arbitrary scenes. To address this issue, we propose a novel 3D scene stylization framework that transfers an arbitrary style to an arbitrary scene, without any style-related or scene-related re-training. Concretely, we first map the appearance of the 3D scene into a 2D style pattern space, which realizes complete disentanglement of the geometry and appearance of the 3D scene and enables our model to generalize to arbitrary 3D scenes. Then we stylize the appearance of the 3D scene in the 2D style pattern space via a prompt-based 2D stylization algorithm. Experimental results demonstrate that our proposed framework is superior to SOTA methods in both visual quality and generalization. (Comment: Accepted to AAAI 2024)
- Published
- 2024
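
The abstract describes a two-stage pipeline: map the scene's appearance into a shared 2D style pattern space, then apply a prompt-conditioned 2D stylization to that pattern and look colors back up during rendering. Below is a minimal, hypothetical sketch of that idea; all module names, shapes, and the placeholder stylizer are assumptions for illustration, not the authors' actual architecture or code.

```python
# Illustrative sketch of the two-stage idea in the abstract (assumed design, not the paper's code):
# (1) map 3D points to UV coordinates in a scene-agnostic 2D "style pattern" space,
# (2) stylize that 2D pattern with a prompt-conditioned 2D stylizer,
# then sample per-point colors from the stylized pattern when rendering.

import torch
import torch.nn as nn
import torch.nn.functional as F


class AppearanceToPatternMapper(nn.Module):
    """Hypothetical MLP mapping 3D points to 2D UV coordinates in the pattern space."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2), nn.Tanh(),  # UV in [-1, 1] for grid_sample
        )

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        return self.mlp(xyz)  # (N, 2)


class PromptStylizer2D(nn.Module):
    """Placeholder prompt-conditioned 2D stylizer; a real system would condition on a text/style encoder."""
    def __init__(self, prompt_dim: int = 16):
        super().__init__()
        self.prompt_dim = prompt_dim
        self.conv = nn.Conv2d(3 + prompt_dim, 3, kernel_size=3, padding=1)

    def forward(self, pattern: torch.Tensor, prompt_emb: torch.Tensor) -> torch.Tensor:
        # Broadcast the prompt embedding over the spatial grid and stylize the 2D pattern.
        b, _, h, w = pattern.shape
        cond = prompt_emb.view(b, self.prompt_dim, 1, 1).expand(b, self.prompt_dim, h, w)
        return torch.sigmoid(self.conv(torch.cat([pattern, cond], dim=1)))


def render_colors(xyz: torch.Tensor, mapper: AppearanceToPatternMapper,
                  stylized_pattern: torch.Tensor) -> torch.Tensor:
    """Look up per-point colors from the stylized 2D pattern via the learned UV mapping."""
    uv = mapper(xyz).view(1, -1, 1, 2)                        # (1, N, 1, 2) sampling grid
    colors = F.grid_sample(stylized_pattern, uv, align_corners=True)
    return colors.squeeze(-1).squeeze(0).t()                  # (N, 3)


if __name__ == "__main__":
    mapper = AppearanceToPatternMapper()
    stylizer = PromptStylizer2D()
    pattern = torch.rand(1, 3, 128, 128)     # appearance baked into the 2D pattern space
    prompt_emb = torch.randn(1, 16)          # stand-in for a text-prompt embedding
    stylized = stylizer(pattern, prompt_emb)
    points = torch.rand(1024, 3) * 2 - 1     # sampled 3D points along camera rays
    print(render_colors(points, mapper, stylized).shape)  # torch.Size([1024, 3])
```

Because geometry stays fixed and only the shared 2D pattern is stylized, the same stylizer can, in principle, be reused across scenes and styles without re-training, which is the generalization property the abstract emphasizes.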