
Integrating Depth-Based and Deep Learning Techniques for Real-Time Video Matting without Green Screens.

Authors :
Su, Pin-Chen
Yang, Mau-Tsuen
Source :
Electronics (2079-9292); Aug 2024, Vol. 13, Issue 16, p3182, 21p
Publication Year :
2024

Abstract

Virtual production, a filmmaking technique that seamlessly merges virtual and real cinematography, has revolutionized the film and television industry. However, traditional virtual production requires the setup of green screens, which can be both costly and cumbersome. We have developed a green-screen-free virtual production system that incorporates a 3D tracker for camera tracking, enabling the compositing of virtual and real-world images from a moving camera with varying perspectives. To address the core issue of video matting in virtual production, we introduce a novel Boundary-Selective Fusion (BSF) technique that combines the alpha mattes generated by deep learning-based and depth-based approaches, leveraging their complementary strengths. Experimental results demonstrate that this combined alpha matte is more accurate and robust than those produced by either method alone. Overall, the proposed BSF technique is competitive with state-of-the-art video matting methods, particularly in scenarios involving humans holding objects or other complex settings. The proposed system enables real-time previewing of composite footage during filmmaking, reducing the costs associated with green screen setups and simplifying the process of compositing virtual and real images. [ABSTRACT FROM AUTHOR]
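
The abstract describes Boundary-Selective Fusion only at a high level, and the paper's exact fusion rule is not reproduced in this record. The Python sketch below is a minimal illustration of the general idea of combining a deep-learning alpha matte with a depth-based alpha matte, keeping the learned matte in a band around the foreground boundary and the depth-based matte elsewhere. The band width, the selection rule, and all function names are assumptions made for illustration, not the authors' implementation.

    # Hypothetical sketch of boundary-selective fusion of two alpha mattes.
    import cv2
    import numpy as np

    def boundary_selective_fusion(alpha_dl, alpha_depth, band_px=10, thresh=0.5):
        """Fuse a deep-learning matte with a depth-based matte.

        Near the foreground boundary (where learned mattes tend to capture
        fine detail such as hair) the deep-learning alpha is kept; elsewhere
        (e.g. held objects that a person-centric network may miss) the
        depth-based alpha is used. Both inputs are float arrays in [0, 1].
        """
        fg = (alpha_dl > thresh).astype(np.uint8)                # binary foreground
        kernel = np.ones((2 * band_px + 1, 2 * band_px + 1), np.uint8)
        band = cv2.dilate(fg, kernel) - cv2.erode(fg, kernel)    # boundary band mask
        band = band.astype(np.float32)
        return band * alpha_dl + (1.0 - band) * alpha_depth

    def composite(frame, virtual_bg, alpha):
        """Blend the real foreground over a virtual background with the fused matte."""
        a = alpha[..., None]                                     # broadcast over RGB
        return (a * frame + (1.0 - a) * virtual_bg).astype(frame.dtype)

In a real-time preview pipeline, each camera frame would yield one matte from a matting network and one from a depth sensor; the fused alpha then drives the composite of the live foreground over the tracked virtual scene.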

Details

Language :
English
ISSN :
2079-9292
Volume :
13
Issue :
16
Database :
Complementary Index
Journal :
Electronics (2079-9292)
Publication Type :
Academic Journal
Accession number :
179382951
Full Text :
https://doi.org/10.3390/electronics13163182