
MCD-Net: Toward RGB-D Video Inpainting in Real-World Scenes.

Authors :
Hou J
Ji Z
Yang J
Wang C
Zheng F
Source :
IEEE transactions on image processing : a publication of the IEEE Signal Processing Society [IEEE Trans Image Process] 2024; Vol. 33, pp. 1095-1108. Date of Electronic Publication: 2024 Feb 05.
Publication Year :
2024

Abstract

Video inpainting has attracted increasing attention owing to its wide applications in intelligent video editing. However, despite tremendous progress in RGB video inpainting, existing RGB-D video inpainting models remain incapable of inpainting real-world RGB-D videos, as they simply fuse color and depth via explicit feature concatenation, neglecting the natural modality gap. Moreover, current RGB-D video inpainting datasets are synthesized from homogeneous and unrealistic RGB-D data, which is far from real-world applications and cannot support comprehensive evaluation. To alleviate these problems and achieve real-world RGB-D video inpainting, we propose, on the one hand, a Mutually-guided Color and Depth Inpainting Network (MCD-Net), in which color and depth are reciprocally leveraged to inpaint each other implicitly, mitigating the modality gap and fully exploiting cross-modal association for inpainting. On the other hand, we build a Video Inpainting with Depth (VID) dataset that supplies diverse and authentic RGB-D video data with various object annotation masks, enabling comprehensive evaluation of RGB-D video inpainting in real-world scenes. Experimental results on the DynaFill benchmark and our collected VID dataset demonstrate that MCD-Net not only yields state-of-the-art quantitative performance but also achieves high-quality RGB-D video inpainting in real-world scenes. All resources are available at https://github.com/JCATCV/MCD-Net.
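The mutual guidance described in the abstract (each modality helping to inpaint the other instead of plain feature concatenation) can be illustrated with a minimal PyTorch sketch. This is a conceptual example under assumed design choices (residual spatial gating, the hypothetical `MutualGuidanceBlock` module and its layer sizes), not the authors' published MCD-Net architecture.

```python
# Minimal conceptual sketch of mutually-guided color/depth feature fusion:
# each branch predicts a gate that modulates the other modality's features,
# rather than concatenating raw color and depth features directly.
# Module and layer choices are illustrative assumptions, not MCD-Net itself.
import torch
import torch.nn as nn


class MutualGuidanceBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Gate derived from color features, applied to the depth branch.
        self.color_to_depth_gate = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        # Gate derived from depth features, applied to the color branch.
        self.depth_to_color_gate = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, color_feat: torch.Tensor, depth_feat: torch.Tensor):
        # Residual gating: depth guides color and color guides depth.
        color_out = color_feat + color_feat * self.depth_to_color_gate(depth_feat)
        depth_out = depth_feat + depth_feat * self.color_to_depth_gate(color_feat)
        return color_out, depth_out


# Usage: fuse per-frame feature maps of shape (N, C, H, W).
color = torch.randn(1, 64, 60, 80)
depth = torch.randn(1, 64, 60, 80)
color_fused, depth_fused = MutualGuidanceBlock(64)(color, depth)
```

The residual-gating form is one simple way to let each modality modulate the other while preserving its own features; the paper's actual cross-modal mechanism should be taken from the linked repository.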

Details

Language :
English
ISSN :
1941-0042
Volume :
33
Database :
MEDLINE
Journal :
IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Publication Type :
Academic Journal
Accession number :
38294916
Full Text :
https://doi.org/10.1109/TIP.2024.3358675