Real-time volumetric rendering of dynamic humans
- Publication Year:
- 2023
Abstract
- We present a method for fast 3D reconstruction and real-time rendering of dynamic humans from monocular videos with accompanying parametric body fits. Our method can reconstruct a dynamic human in less than 3 hours on a single GPU, compared to recent state-of-the-art alternatives that take up to 72 hours. These speedups are obtained by using a lightweight deformation model based solely on linear blend skinning, and an efficient factorized volumetric representation for modeling the shape and color of the person in canonical pose. Moreover, we propose a novel local ray-marching rendering approach which, by exploiting standard GPU hardware and without any baking or conversion of the radiance field, allows visualizing the neural human on a mobile VR device at 40 frames per second with minimal loss of visual quality. Our experimental evaluation shows superior or competitive results compared with state-of-the-art methods while obtaining a large training speedup, using a simple model, and achieving real-time rendering.
- Comment: Project page: https://real-time-humans.github.io/
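- For reference, linear blend skinning (the class of deformation model the abstract refers to) takes the standard form below; the notation is a generic sketch and is not taken from the paper itself:

  \[
    x_p = \sum_{k=1}^{K} w_k(x_c)\, T_k\, x_c, \qquad \sum_{k=1}^{K} w_k(x_c) = 1,
  \]

  where \(x_c\) is a point in the canonical pose, \(w_k(x_c)\) are per-point skinning weights, and \(T_k \in SE(3)\) are the rigid bone transformations of the target pose.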
Details
- Database:
- arXiv
- Publication Type:
- Report
- Accession number:
- edsarx.2303.11898
- Document Type:
- Working Paper