
Total-Decom: Decomposed 3D Scene Reconstruction with Minimal Interaction

Authors :
Lyu, Xiaoyang
Chang, Chirui
Dai, Peng
Sun, Yang-Tian
Qi, Xiaojuan
Publication Year :
2024

Abstract

Scene reconstruction from multi-view images is a fundamental problem in computer vision and graphics. Recent neural implicit surface reconstruction methods have achieved high-quality results; however, editing and manipulating the 3D geometry of reconstructed scenes remains challenging due to the absence of naturally decomposed object entities and complex object/background compositions. In this paper, we present Total-Decom, a novel method for decomposed 3D reconstruction with minimal human interaction. Our approach seamlessly integrates the Segment Anything Model (SAM) with hybrid implicit-explicit neural surface representations and a mesh-based region-growing technique for accurate 3D object decomposition. Total-Decom requires minimal human annotations while providing users with real-time control over the granularity and quality of decomposition. We extensively evaluate our method on benchmark datasets and demonstrate its potential for downstream applications, such as animation and scene editing. The code is available at https://github.com/CVMI-Lab/Total-Decom.git.

Comments :
8 pages, 7 figures; accepted by CVPR 2024
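The abstract's "mesh-based region-growing" step can be illustrated generically: starting from a seed face, a region expands across adjacent faces while some similarity criterion holds. The sketch below is a minimal, hypothetical version using face-normal similarity as the growing criterion; it is not the paper's actual algorithm (which is guided by SAM-derived cues), and the function name, data layout, and threshold are all illustrative assumptions.

```python
from collections import deque


def region_grow(adjacency, normals, seed, cos_thresh=0.9):
    """Grow a set of mesh faces outward from `seed` by breadth-first search.

    A neighboring face joins the region when its unit normal is similar
    (dot product >= cos_thresh) to the normal of the face it was reached from.

    adjacency: dict mapping face id -> list of adjacent face ids
    normals:   dict mapping face id -> unit normal as an (x, y, z) tuple
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    region = {seed}
    queue = deque([seed])
    while queue:
        face = queue.popleft()
        for nb in adjacency[face]:
            if nb not in region and dot(normals[nb], normals[face]) >= cos_thresh:
                region.add(nb)
                queue.append(nb)
    return region


# Toy example: faces 0-2 are coplanar (normal +z); face 3 is tilted (+x),
# so growing from face 0 stops at the orientation discontinuity.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
nrm = {0: (0, 0, 1), 1: (0, 0, 1), 2: (0, 0, 1), 3: (1, 0, 0)}
print(region_grow(adj, nrm, seed=0))  # the flat patch {0, 1, 2}
```

In a real pipeline the criterion would combine geometric cues with 2D segmentation masks projected onto the mesh, and the threshold would act as the user-facing granularity control the abstract describes.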

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.19314
Document Type :
Working Paper