
Neural Fields meet Explicit Geometric Representation for Inverse Rendering of Urban Scenes

Authors:
Wang, Zian
Shen, Tianchang
Gao, Jun
Huang, Shengyu
Munkberg, Jacob
Hasselgren, Jon
Gojcic, Zan
Chen, Wenzheng
Fidler, Sanja
Publication Year: 2023

Abstract

Reconstruction and intrinsic decomposition of scenes from captured imagery would enable many applications such as relighting and virtual object insertion. Recent NeRF-based methods achieve impressive fidelity of 3D reconstruction but bake the lighting and shadows into the radiance field, while mesh-based methods that facilitate intrinsic decomposition through differentiable rendering have not yet scaled to the complexity of large outdoor scenes. We present a novel inverse rendering framework for large urban scenes capable of jointly reconstructing the scene geometry, spatially-varying materials, and HDR lighting from a set of posed RGB images with optional depth. Specifically, we use a neural field to account for the primary rays, and an explicit mesh (reconstructed from the underlying neural field) to model the secondary rays that produce higher-order lighting effects such as cast shadows. By faithfully disentangling complex geometry and materials from lighting effects, our method enables photorealistic relighting with specular and shadow effects on several outdoor datasets. Moreover, it supports physics-based scene manipulations such as virtual object insertion with ray-traced shadow casting.

Comment: CVPR 2023. Project page: https://nv-tlabs.github.io/fegr/
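The hybrid idea the abstract describes, a neural field for primary (camera) rays combined with an explicit mesh for secondary (shadow) rays, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the MLP weights are untrained random stand-ins for a learned field, the occluder is a single hand-placed triangle rather than a mesh extracted from the field, and the shading is plain diffuse attenuation; all of these are assumptions made purely for illustration.

```python
import numpy as np

# Neural field stub: an untrained MLP mapping 3D position -> (density, albedo).
# Random weights stand in for a field trained on posed images (hypothetical).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 32)), rng.normal(size=32)
W2, b2 = rng.normal(size=(32, 4)), rng.normal(size=4)

def field(x):
    h = np.tanh(x @ W1 + b1)
    out = h @ W2 + b2
    sigma = np.log1p(np.exp(out[..., 0]))         # softplus -> nonnegative density
    albedo = 1.0 / (1.0 + np.exp(-out[..., 1:]))  # sigmoid -> [0,1] RGB albedo
    return sigma, albedo

def render_primary(o, d, n=64, t_near=0.0, t_far=4.0):
    """Volume-render a primary ray; return expected depth and accumulated albedo."""
    ts = np.linspace(t_near, t_far, n)
    sigma, albedo = field(o + ts[:, None] * d)
    alpha = 1.0 - np.exp(-sigma * (ts[1] - ts[0]))
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    w = trans * alpha                              # per-sample termination weights
    depth = (w * ts).sum() / (w.sum() + 1e-9)
    return depth, (w[:, None] * albedo).sum(axis=0)

def ray_triangle(o, d, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore ray/triangle test; returns hit distance t, or None on miss."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(d, e2)
    det = e1 @ p
    if abs(det) < eps:
        return None
    inv = 1.0 / det
    u = ((o - v0) @ p) * inv
    if not 0.0 <= u <= 1.0:
        return None
    q = np.cross(o - v0, e1)
    v = (d @ q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = (e2 @ q) * inv
    return t if t > eps else None

def shadow_visibility(p, light_dir, tris):
    """Secondary ray against the explicit mesh: 0 if any triangle occludes the light."""
    for v0, v1, v2 in tris:
        if ray_triangle(p + 1e-3 * light_dir, light_dir, v0, v1, v2) is not None:
            return 0.0
    return 1.0

# Hypothetical scene: one directional light and a single occluding triangle.
light_dir = np.array([0.3, 1.0, 0.2]); light_dir /= np.linalg.norm(light_dir)
mesh = [(np.array([-1.0, 1.0, -1.0]), np.array([1.0, 1.0, -1.0]),
         np.array([0.0, 1.0, 1.0]))]

o, d = np.array([0.0, 0.5, -3.0]), np.array([0.0, 0.0, 1.0])  # camera ray
depth, albedo = render_primary(o, d)              # primary ray: query the neural field
p_surf = o + depth * d                            # estimated surface point
vis = shadow_visibility(p_surf, light_dir, mesh)  # secondary ray: intersect the mesh
print(f"depth={depth:.3f} visibility={vis} color={(albedo * vis).round(3)}")
```

The sketch also hints at the abstract's motivation for the hybrid design: a shadow ray against an extracted mesh reduces to a few triangle intersections, whereas marching every secondary ray through the field would cost dozens of network queries each, which matters at urban scale.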

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2304.03266
Document Type: Working Paper