1. Estimating Body and Hand Motion in an Ego-sensed World
- Authors
Yi, Brent, Ye, Vickie, Zheng, Maya, Li, Yunqi, Müller, Lea, Pavlakos, Georgios, Ma, Yi, Malik, Jitendra, and Kanazawa, Angjoo
- Subjects
Computer Science - Computer Vision and Pattern Recognition, Computer Science - Artificial Intelligence
- Abstract
We present EgoAllo, a system for human motion estimation from a head-mounted device. Using only egocentric SLAM poses and images, EgoAllo guides sampling from a conditional diffusion model to estimate 3D body pose, height, and hand parameters that capture a device wearer's actions in the allocentric coordinate frame of the scene. To achieve this, our key insight is in representation: we propose spatial and temporal invariance criteria for improving model performance, from which we derive a head motion conditioning parameterization that improves estimation by up to 18%. We also show how the bodies estimated by our system can improve hand estimation: the resulting kinematic and temporal constraints can reduce world-frame errors in single-frame estimates by 40%. Project page: https://egoallo.github.io/
- Published
2024
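The abstract's central technical point is a head-motion conditioning parameterization chosen for spatial invariance, i.e. one that does not depend on the arbitrary world frame produced by SLAM. The sketch below is a minimal illustration of that general idea, not EgoAllo's actual parameterization: it maps a sequence of SE(3) head poses to relative frame-to-frame transforms (unchanged by any global rigid motion of the world) plus head height above the floor (unchanged by yaw and horizontal translation, assuming a gravity-aligned, z-up world). The function name and feature layout are illustrative assumptions.

```python
import numpy as np


def head_motion_conditioning(T_world_head: np.ndarray) -> np.ndarray:
    """Illustrative world-frame-invariant head-motion features.

    T_world_head: (N, 4, 4) array of SE(3) head poses in the world frame.
    Returns an (N-1, 13) array per frame transition: the flattened relative
    transform between consecutive head poses (12 values) plus head height
    above the floor (1 value, assuming a gravity-aligned, z-up world).
    """
    # Relative pose from frame t to t+1, expressed in the frame-t head
    # coordinates; invariant to any global rigid transform of the world.
    rel = np.linalg.inv(T_world_head[:-1]) @ T_world_head[1:]
    rel_flat = rel[:, :3, :].reshape(len(rel), 12)  # rotation + translation rows

    # Head height above the floor: the world z component of the translation.
    height = T_world_head[1:, 2, 3:4]

    return np.concatenate([rel_flat, height], axis=1)


if __name__ == "__main__":
    # Quick shape check on a synthetic trajectory of identity orientations.
    N = 8
    poses = np.tile(np.eye(4), (N, 1, 1))
    poses[:, 2, 3] = 1.6 + 0.01 * np.arange(N)  # head roughly 1.6 m above the floor
    feats = head_motion_conditioning(poses)
    print(feats.shape)  # (7, 13)
```

In a setup like this, the feature vector (rather than raw world-frame poses) would serve as the conditioning input to a motion model, so that re-running SLAM with a different world origin leaves the conditioning signal unchanged.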