SMERF: Streamable Memory Efficient Radiance Fields for Real-Time Large-Scene Exploration

Authors:
Duckworth, Daniel
Hedman, Peter
Reiser, Christian
Zhizhin, Peter
Thibert, Jean-François
Lučić, Mario
Szeliski, Richard
Barron, Jonathan T.
Publication Year: 2023

Abstract

Recent techniques for real-time view synthesis have rapidly advanced in fidelity and speed, and modern methods are capable of rendering near-photorealistic scenes at interactive frame rates. At the same time, a tension has arisen between explicit scene representations amenable to rasterization and neural fields built on ray marching, with state-of-the-art instances of the latter surpassing the former in quality while being prohibitively expensive for real-time applications. In this work, we introduce SMERF, a view synthesis approach that achieves state-of-the-art accuracy among real-time methods on large scenes with footprints up to 300 m$^2$ at a volumetric resolution of 3.5 mm$^3$. Our method is built upon two primary contributions: a hierarchical model partitioning scheme, which increases model capacity while constraining compute and memory consumption, and a distillation training strategy that simultaneously yields high fidelity and internal consistency. Our approach enables full six degrees of freedom (6DOF) navigation within a web browser and renders in real-time on commodity smartphones and laptops. Extensive experiments show that our method exceeds the current state-of-the-art in real-time novel view synthesis by 0.78 dB on standard benchmarks and 1.78 dB on large scenes, renders frames three orders of magnitude faster than state-of-the-art radiance field models, and achieves real-time performance across a wide variety of commodity devices, including smartphones. We encourage readers to explore these models interactively at our project website: https://smerf-3d.github.io.

Comment: Camera Ready. Project website: https://smerf-3d.github.io
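The abstract describes the hierarchical partitioning only at a high level. For intuition, here is a minimal Python sketch of one way such a scheme could bound memory: the scene footprint is divided into a regular 2D grid, each cell owns an independent submodel, and the camera position selects which one is active. All names (`Submodel`, `select_submodel`) and the grid layout are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch: map a camera position to the submodel owning its grid cell.
# Hypothetical structure; the paper's real partitioning is not specified here.
from dataclasses import dataclass

@dataclass
class Submodel:
    """Stand-in for one spatial partition's radiance-field parameters."""
    cell_index: tuple  # (i, j) position in the partition grid

def select_submodel(camera_xy, scene_min, cell_size, grid_shape, submodels):
    """Return the submodel for the grid cell containing camera_xy."""
    i = int((camera_xy[0] - scene_min[0]) // cell_size)
    j = int((camera_xy[1] - scene_min[1]) // cell_size)
    # Clamp so cameras just outside the footprint reuse a border submodel.
    i = max(0, min(i, grid_shape[0] - 1))
    j = max(0, min(j, grid_shape[1] - 1))
    return submodels[(i, j)]

# Usage: a 5x5 grid of 3.5 m cells covering a ~300 m^2 footprint.
submodels = {(i, j): Submodel((i, j)) for i in range(5) for j in range(5)}
active = select_submodel((7.2, 3.1), (0.0, 0.0), 3.5, (5, 5), submodels)
```

Under an arrangement like this, only the submodel for the viewer's current cell needs to be resident, so memory stays bounded while total capacity grows with the number of cells; crossing a cell boundary triggers streaming in the neighboring submodel, consistent with the "streamable" framing in the title.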

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2312.07541
Document Type: Working Paper