1. RDNeRF: relative depth guided NeRF for dense free view synthesis.
- Authors
Qiu, Jiaxiong; Zhu, Yifan; Jiang, Peng-Tao; Cheng, Ming-Ming; Ren, Bo
- Subjects
RADIANCE; GEOMETRY
- Abstract
In this paper, we focus on dense view synthesis with free movements in indoor scenes, which enables richer user interaction than sparse views. A neural radiance field (NeRF) handles sparsely and spherically captured scenes well but struggles in scenes with dense free views. We extend NeRF to handle these views of indoor scenes. We present a learning-based approach named relative depth guided NeRF (RDNeRF), which jointly renders RGB images and recovers scene geometry in dense free views. To recover the geometry of each view without ground-truth depth, we propose to directly learn relative depth with implicit functions and transform it into a geometric volume bound for geometry-aware sampling and integration in NeRF. With correct scene geometry, we further model the implicit internal relevance of the inputs to enhance NeRF's representation ability in dense free views. We conduct extensive experiments on indoor scenes for dense free view synthesis. RDNeRF outperforms current state-of-the-art methods, achieving a PSNR of 24.95 and an SSIM of 0.77, and recovers more accurate geometry than baseline models.
- Published
2024
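The geometry-aware sampling step described in the abstract can be pictured with a short sketch: a predicted relative depth per ray is turned into a bounded interval along the ray, samples are drawn inside that bound, and standard NeRF alpha compositing integrates them. This is a minimal illustration under stated assumptions, not RDNeRF's actual implementation; the names (`geometry_aware_samples`, `bound_width`) and the linear relative-to-metric depth mapping are hypothetical, since the abstract does not specify the exact transform.

```python
import numpy as np

def geometry_aware_samples(rel_depth, near, far, n_samples, bound_width=0.1):
    """Place samples in a bound [d - w, d + w] around the depth implied by a
    relative-depth prediction, instead of uniformly over [near, far]."""
    d = near + rel_depth * (far - near)              # assumed linear mapping
    w = bound_width * (far - near)
    lo = np.clip(d - w, near, far)
    hi = np.clip(d + w, near, far)
    t = np.linspace(0.0, 1.0, n_samples)
    return lo[..., None] + t * (hi - lo)[..., None]  # (n_rays, n_samples)

def composite(sigmas, rgbs, z_vals):
    """Standard NeRF alpha compositing along each ray."""
    # Distances between adjacent samples; a large value closes the last bin.
    deltas = np.diff(z_vals, axis=-1, append=z_vals[..., -1:] + 1e10)
    alpha = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches each sample unoccluded.
    trans = np.cumprod(
        np.concatenate([np.ones_like(alpha[..., :1]),
                        1.0 - alpha[..., :-1] + 1e-10], axis=-1), axis=-1)
    weights = alpha * trans
    rgb = (weights[..., None] * rgbs).sum(axis=-2)   # rendered color
    depth = (weights * z_vals).sum(axis=-1)          # recovered per-ray depth
    return rgb, depth

# Toy usage: 4 rays with predicted relative depths in [0, 1].
rel_depth = np.array([0.2, 0.5, 0.7, 0.9])
z = geometry_aware_samples(rel_depth, near=0.5, far=6.0, n_samples=32)
sigmas = np.random.rand(4, 32)        # stand-in for MLP density output
rgbs = np.random.rand(4, 32, 3)       # stand-in for MLP color output
rgb, depth = composite(sigmas, rgbs, z)
print(rgb.shape, depth.shape)         # (4, 3) (4,)
```

In the full method, `sigmas` and `rgbs` would come from the radiance-field MLP evaluated at the bounded sample positions, and the composited per-ray depth is one way the recovered geometry could be read out.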