
HMD$^2$: Environment-aware Motion Generation from Single Egocentric Head-Mounted Device

Authors:
Guzov, Vladimir
Jiang, Yifeng
Hong, Fangzhou
Pons-Moll, Gerard
Newcombe, Richard
Liu, C. Karen
Ye, Yuting
Ma, Lingni
Publication Year: 2024

Abstract

This paper investigates the online generation of realistic full-body human motion from a single head-mounted device equipped with an outward-facing color camera and capable of visual SLAM. Given the inherent ambiguity of this setup, we introduce HMD$^2$, a novel system designed to balance motion reconstruction against motion generation. On the reconstruction side, the system makes maximal use of the camera streams to produce both analytical and learned features: head motion, the SLAM point cloud, and image embeddings. On the generative side, HMD$^2$ employs a multi-modal conditional motion diffusion model with a time-series backbone that maintains temporal coherence in the generated motion, and uses autoregressive in-painting to enable online inference with minimal latency (0.17 seconds). Together, these components yield an effective and robust solution that scales to an extensive dataset of over 200 hours of recordings, collected in a wide range of complex indoor and outdoor environments using publicly available smart glasses.
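To make the generative pipeline concrete, below is a minimal sketch of autoregressive in-painting with a conditional motion diffusion model. Everything here is an illustrative assumption rather than the authors' implementation: the module names, dimensionalities, the transformer backbone, and the simple DDPM-style sampler are hypothetical stand-ins. The idea shown is that, at every denoising step, the frames that overlap the previously generated window are clamped to a correctly noised copy of that known motion, so each new window stays consistent with the stream produced so far.

```python
# Sketch of autoregressive in-painting for online conditional motion
# diffusion. All names, sizes, and the DDPM-style sampler are
# illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn

POSE_DIM, COND_DIM, WINDOW, OVERLAP, STEPS = 132, 256, 60, 45, 50


class Denoiser(nn.Module):
    """Hypothetical time-series backbone: predicts the clean motion
    window from a noisy one, conditioned on per-frame features."""

    def __init__(self):
        super().__init__()
        self.inp = nn.Linear(POSE_DIM + COND_DIM + 1, 512)
        layer = nn.TransformerEncoderLayer(512, 8, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=4)
        self.out = nn.Linear(512, POSE_DIM)

    def forward(self, x_noisy, cond, t):
        # cond: multi-modal features per frame (head motion, SLAM point
        # cloud, image embeddings) assumed pre-fused into COND_DIM dims.
        t_feat = t.view(-1, 1, 1).expand(-1, x_noisy.shape[1], 1)
        h = self.inp(torch.cat([x_noisy, cond, t_feat], dim=-1))
        return self.out(self.backbone(h))


@torch.no_grad()
def sample_window(model, cond, prefix, alphas_bar):
    """One diffusion sampling pass. At every denoising step the first
    OVERLAP frames are in-painted with a noised copy of the motion
    already generated, keeping the new window consistent with it."""
    x = torch.randn(1, WINDOW, POSE_DIM)
    for t in reversed(range(STEPS)):
        ab = alphas_bar[t]
        if prefix is not None:
            noised = ab.sqrt() * prefix + (1 - ab).sqrt() * torch.randn_like(prefix)
            x[:, :OVERLAP] = noised
        x0 = model(x, cond, torch.full((1,), t / STEPS))
        if t > 0:
            # Re-noise the predicted clean motion to the previous step.
            ab_prev = alphas_bar[t - 1]
            x = ab_prev.sqrt() * x0 + (1 - ab_prev).sqrt() * torch.randn_like(x)
        else:
            x = x0
    return x


# Example online loop with an untrained model and random stand-in
# features: each new window carries its tail forward as the next prefix.
betas = torch.linspace(1e-4, 0.02, STEPS)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)
model = Denoiser().eval()
prefix = None
for _ in range(3):  # stream of incoming sensor windows
    cond = torch.randn(1, WINDOW, COND_DIM)  # stand-in for real features
    window = sample_window(model, cond, prefix, alphas_bar)
    prefix = window[:, -OVERLAP:]
```

In this sliding-window scheme, only the WINDOW - OVERLAP newest frames are truly generated per pass, which is what allows online inference at low latency: the model never waits for a full sequence before emitting motion.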

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2409.13426
Document Type: Working Paper