
EgoBody: Human Body Shape and Motion of Interacting People from Head-Mounted Devices

Authors:
Zhang, Siwei
Ma, Qianli
Zhang, Yan
Qian, Zhiyin
Kwon, Taein
Pollefeys, Marc
Bogo, Federica
Tang, Siyu
Publication Year:
2021

Abstract

Understanding social interactions from egocentric views is crucial for many applications, ranging from assistive robotics to AR/VR. Key to reasoning about interactions is to understand the body pose and motion of the interaction partner from the egocentric view. However, research in this area is severely hindered by the lack of datasets. Existing datasets are limited in size, capture/annotation modalities, ground-truth quality, or interaction diversity. We fill this gap by proposing EgoBody, a novel large-scale dataset for human pose, shape, and motion estimation from egocentric views, captured during interactions in complex 3D scenes. We employ Microsoft HoloLens2 headsets to record rich egocentric data streams (including RGB, depth, eye gaze, and head and hand tracking). To obtain accurate 3D ground truth, we calibrate the headset with a multi-Kinect rig and fit expressive SMPL-X body meshes to multi-view RGB-D frames, reconstructing 3D human shapes and poses relative to the scene over time. We collect 125 sequences spanning diverse interaction scenarios and propose the first benchmark for 3D full-body pose and shape estimation of the social partner from egocentric views. We extensively evaluate state-of-the-art methods, highlight their limitations in the egocentric scenario, and address these limitations by leveraging our high-quality annotations. Data and code are available at https://sanweiliti.github.io/egobody/egobody.html.

Comment: Camera-ready version for ECCV 2022, appendix included.
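
To make the ground-truth representation concrete: the dataset's 3D annotations are expressive SMPL-X body meshes, which are parameterized by shape coefficients and per-joint axis-angle rotations. Below is a minimal sketch using the publicly available smplx Python package and PyTorch; the model path is a placeholder, and the zero parameters merely stand in for the per-frame estimates that the paper's multi-view fitting pipeline would produce. This is not the authors' fitting code, only an illustration of the body model being fit.

    import torch
    import smplx

    # Placeholder path to locally downloaded SMPL-X model files (assumption).
    MODEL_DIR = "models/smplx"

    # Build a neutral-gender SMPL-X body model (10 shape betas by default).
    model = smplx.create(MODEL_DIR, model_type="smplx",
                         gender="neutral", use_pca=False)

    # Zero parameters stand in for per-frame estimates from multi-view fitting.
    betas = torch.zeros(1, 10)         # body shape coefficients
    body_pose = torch.zeros(1, 63)     # 21 body joints x 3 (axis-angle)
    global_orient = torch.zeros(1, 3)  # root orientation relative to the scene

    output = model(betas=betas, body_pose=body_pose,
                   global_orient=global_orient, return_verts=True)

    vertices = output.vertices.detach()  # (1, 10475, 3) posed mesh vertices
    joints = output.joints.detach()      # 3D joint locations
    print(vertices.shape, joints.shape)

Fitting such a model to multi-view RGB-D observations, as the paper describes, amounts to optimizing these parameters so that the posed mesh agrees with the captured frames over time.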

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2112.07642
Document Type:
Working Paper