
SplatFormer: Point Transformer for Robust 3D Gaussian Splatting

Authors :
Chen, Yutong
Mihajlovic, Marko
Chen, Xiyi
Wang, Yiming
Prokudin, Sergey
Tang, Siyu
Publication Year :
2024

Abstract

3D Gaussian Splatting (3DGS) has recently transformed photorealistic reconstruction, achieving high visual fidelity and real-time performance. However, rendering quality significantly deteriorates when test views deviate from the camera angles used during training, posing a major challenge for applications in immersive free-viewpoint rendering and navigation. In this work, we conduct a comprehensive evaluation of 3DGS and related novel view synthesis methods under out-of-distribution (OOD) test camera scenarios. By creating diverse test cases with synthetic and real-world datasets, we demonstrate that most existing methods, including those incorporating various regularization techniques and data-driven priors, struggle to generalize effectively to OOD views. To address this limitation, we introduce SplatFormer, the first point transformer model specifically designed to operate on Gaussian splats. SplatFormer takes as input an initial 3DGS set optimized under limited training views and refines it in a single forward pass, effectively removing potential artifacts in OOD test views. To our knowledge, this is the first successful application of point transformers directly on 3DGS sets, surpassing the limitations of previous multi-scene training methods, which could handle only a restricted number of input views during inference. Our model significantly improves rendering quality under extreme novel views, achieving state-of-the-art performance in these challenging scenarios and outperforming various 3DGS regularization techniques, multi-scene models tailored for sparse view synthesis, and diffusion-based frameworks.

Comment: Code and dataset: https://github.com/ChenYutongTHU/SplatFormer Project page: https://sergeyprokudin.github.io/splatformer/
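The abstract describes the core interface: a point-transformer-style network consumes an initialized set of per-Gaussian parameters and refines them in a single forward pass. The sketch below is a minimal, hypothetical illustration of that interface only; the class names (`SplatRefiner`), the plain self-attention backbone, and the parameter layout are assumptions for illustration, not the authors' architecture (see the linked repository for the actual implementation).

```python
# Hypothetical sketch: a transformer refines an initialized 3DGS parameter set
# in one forward pass, as described in the abstract. Not the authors' code.
import torch
import torch.nn as nn


class SplatRefiner(nn.Module):
    """Predicts residuals for per-Gaussian parameters (position, scale,
    rotation, opacity, spherical-harmonic color) in a single pass."""

    def __init__(self, feat_dim: int = 59, d_model: int = 128, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Linear(feat_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, feat_dim)  # residuals over splat params

    def forward(self, splats: torch.Tensor) -> torch.Tensor:
        # splats: (B, N, feat_dim) concatenated Gaussian parameters
        tokens = self.encoder(self.embed(splats))
        return splats + self.head(tokens)  # refined splat set, single pass


if __name__ == "__main__":
    # Assumed layout: xyz (3) + log-scale (3) + quaternion (4) + opacity (1)
    # + SH color coefficients (48) = 59 values per Gaussian.
    init_splats = torch.randn(1, 2048, 59)
    refined = SplatRefiner()(init_splats)
    print(refined.shape)  # torch.Size([1, 2048, 59])
```

The refined parameters would then be passed to a standard 3DGS rasterizer for rendering; the single-pass residual design mirrors the abstract's claim of removing OOD-view artifacts without per-scene re-optimization.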

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2411.06390
Document Type :
Working Paper