
Conditional-Flow NeRF: Accurate 3D modelling with reliable uncertainty quantification

Authors :
Shen, Jianxiong
Agudo, Antonio
Moreno-Noguer, Francesc
Ruiz Ovejero, Adrià
Contributors :
China Scholarship Council
Agencia Estatal de Investigación (Spain)
Ministerio de Ciencia, Innovación y Universidades (Spain)
NVIDIA Corporation
Publication Year :
2022

Abstract

A critical limitation of current methods based on Neural Radiance Fields (NeRF) is that they are unable to quantify the uncertainty associated with the learned appearance and geometry of the scene. This information is paramount in real applications such as medical diagnosis or autonomous driving, where, to reduce potentially catastrophic failures, the confidence in the model outputs must be incorporated into the decision-making process. In this context, we introduce Conditional-Flow NeRF (CF-NeRF), a novel probabilistic framework to incorporate uncertainty quantification into NeRF-based approaches. For this purpose, our method learns a distribution over all possible radiance fields modelling the scene, which is used to quantify the uncertainty associated with the modelled scene. In contrast to previous approaches that enforce strong constraints on the radiance field distribution, CF-NeRF learns it in a flexible and fully data-driven manner by coupling Latent Variable Modelling and Conditional Normalizing Flows. This strategy yields reliable uncertainty estimates while preserving model expressivity. Compared to previous state-of-the-art methods proposed for uncertainty quantification in NeRF, our experiments show that the proposed method achieves significantly lower prediction errors and more reliable uncertainty values for synthetic novel view and depth-map estimation.
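To make the core idea concrete, the following is a minimal sketch, not the authors' implementation, of how a conditional normalizing flow can turn samples of a latent variable into a distribution over per-point radiance values, whose spread serves as an uncertainty estimate. The module names, layer sizes, the single affine-coupling design, and the stand-in per-point features are all assumptions made here for illustration.

```python
# Illustrative sketch (assumptions throughout): a conditional normalizing flow
# maps Gaussian latent samples to radiance values, conditioned on a per-point
# scene feature; repeated sampling gives a Monte Carlo uncertainty estimate.
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """One affine coupling step whose scale/shift depend on a per-point
    conditioning feature (e.g., a NeRF-style positional encoding)."""
    def __init__(self, dim, cond_dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z, cond):
        z1, z2 = z[..., :self.half], z[..., self.half:]
        s, t = self.net(torch.cat([z1, cond], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)                   # keep scales numerically stable
        y2 = z2 * torch.exp(s) + t          # invertible affine transform
        log_det = s.sum(dim=-1)             # log|det Jacobian| of this step
        return torch.cat([z1, y2], dim=-1), log_det

class ConditionalFlow(nn.Module):
    """Stack of coupling layers: transforms a base Gaussian into a flexible,
    data-driven distribution over per-point radiance (RGB + density)."""
    def __init__(self, dim=4, cond_dim=32, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            ConditionalAffineCoupling(dim, cond_dim) for _ in range(n_layers))

    def forward(self, z, cond):
        log_det = torch.zeros(z.shape[:-1], device=z.device)
        for layer in self.layers:
            z, ld = layer(z, cond)
            z = z.flip(-1)                  # permute so all dims get updated
            log_det = log_det + ld
        return z, log_det

if __name__ == "__main__":
    flow = ConditionalFlow(dim=4, cond_dim=32)
    cond = torch.randn(1024, 32)            # stand-in per-point features
    samples = []
    for _ in range(16):                      # K latent samples of the field
        z = torch.randn(1024, 4)
        radiance, _ = flow(z, cond)
        samples.append(radiance)
    samples = torch.stack(samples)           # (K, points, 4)
    mean = samples.mean(dim=0)               # predicted radiance per point
    var = samples.var(dim=0)                 # per-point uncertainty estimate
    print(mean.shape, var.shape)
```

In the paper's setting, such per-point samples would be composited along camera rays by volume rendering, so the variance across samples propagates into per-pixel uncertainty for rendered views and depth maps; the sketch above stops at the per-point distribution for brevity.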

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1380452547
Document Type :
Electronic Resource