
Neural Scene Baking for Permutation Invariant Transparency Rendering with Real-time Global Illumination

Authors:
Zhang, Ziyang
Simo-Serra, Edgar
Publication Year:
2024

Abstract

Neural rendering provides a fundamentally new way to render photorealistic images. Similar to traditional light-baking methods, neural rendering utilizes neural networks to bake representations of scenes, materials, and lights into latent vectors learned from path-tracing ground truths. However, existing neural rendering algorithms typically use G-buffers to provide position, normal, and texture information of scenes, which are prone to occlusion by transparent surfaces, leading to distortions and loss of detail in the rendered images. To address this limitation, we propose a novel neural rendering pipeline that accurately renders the scene behind transparent surfaces with global illumination and variable scenes. Our method separates the G-buffers of opaque and transparent objects, retaining G-buffer information behind transparent objects. Additionally, to render the transparent objects with permutation invariance, we design a new permutation-invariant neural blending function. We integrate our algorithm into an efficient custom renderer to achieve real-time performance. Our results show that our method is capable of rendering photorealistic images with variable scenes and viewpoints, accurately capturing complex transparent structures along with global illumination. Our renderer achieves real-time performance ($256\times 256$ at 63 FPS and $512\times 512$ at 32 FPS) on scenes with multiple variable transparent objects.

Comment: This paper has been accepted by Computational Visual Media.
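The abstract's "permutation-invariant neural blending function" presumably means the blended result must not depend on the order in which transparent layers are fed to the network. A minimal sketch of how such invariance is typically obtained (Deep Sets style: a shared per-layer encoder followed by symmetric sum pooling) is shown below; the dimensions, weights, and function names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: per-layer G-buffer feature width, hidden width, RGB output.
F_IN, F_HID, F_OUT = 8, 16, 3

# Shared weights applied identically to every transparent layer (assumed stand-ins
# for trained network parameters).
W_enc = rng.standard_normal((F_IN, F_HID)) * 0.1
W_dec = rng.standard_normal((F_HID, F_OUT)) * 0.1

def blend(layers: np.ndarray) -> np.ndarray:
    """Permutation-invariant blend of N transparent-layer features.

    layers: (N, F_IN) array, one row per transparent surface hit.
    Each row is encoded by the same small network, the encodings are
    sum-pooled (a symmetric function, hence order-independent), and the
    pooled vector is decoded to an RGB-like output.
    """
    h = np.tanh(layers @ W_enc)   # shared per-layer encoder
    pooled = h.sum(axis=0)        # symmetric pooling over layers
    return np.tanh(pooled @ W_dec)

layers = rng.standard_normal((4, F_IN))
out_a = blend(layers)
out_b = blend(layers[::-1])       # same layers, reversed order
assert np.allclose(out_a, out_b)  # identical output for any layer ordering
```

Because the only cross-layer operation is a sum, any permutation of the input rows yields the same pooled vector and therefore the same blended color, which is the property the abstract's blending function requires.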

Subjects:
Computer Science - Graphics

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2405.19056
Document Type:
Working Paper