
Visually Plausible Human-Object Interaction Capture from Wearable Sensors

Authors:
Guzov, V.
Sattler, T.
Pons-Moll, G.
Publication Year:
2022

Abstract

In everyday life, humans naturally modify their surrounding environment through interactions, e.g., moving a chair to sit on it. To reproduce such interactions in virtual spaces (e.g., the metaverse), we need to be able to capture and model them, including changes in the scene geometry, ideally from ego-centric input alone (head camera and body-worn inertial sensors). This is an extremely hard problem, especially since the object/scene might not be visible from the head camera (e.g., a human not looking at a chair while sitting down, or not looking at the door handle while opening a door). In this paper, we present HOPS, the first method to capture interactions such as dragging objects and opening doors from ego-centric data alone. Central to our method is reasoning about human-object interactions, which allows us to track objects even when they are not visible from the head camera. HOPS localizes and registers both the human and the dynamic object in a pre-scanned static scene. HOPS is an important first step towards advanced AR/VR applications based on immersive virtual universes, and can provide human-centric training data to teach machines to interact with their surroundings. The supplementary video, data, and code will be available on our project page at http://virtualhumans.mpi-inf.mpg.de/hops/
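
To make the core idea concrete, the sketch below illustrates one simple way an object's pose could be propagated through human contact so that a manipulated object stays tracked while out of the head camera's view. It is a minimal illustration under stated assumptions, not the authors' HOPS implementation: the rigid hand-object attachment, the function name track_object_through_contact, and the availability of contact flags are all assumptions made for this example.

# Minimal sketch (not the HOPS implementation): propagate an object's pose
# through detected hand-object contact, so the object remains tracked even
# when the head camera does not see it. Rigid attachment during contact is
# an assumed simplification.
import numpy as np

def track_object_through_contact(hand_poses, contact, obj_pose_init):
    """
    hand_poses:    (T, 4, 4) world-space hand transforms from body tracking (e.g., IMUs).
    contact:       (T,) bool, True while the hand is assumed to grasp the object.
    obj_pose_init: (4, 4) last visually observed object pose in the scene.
    Returns (T, 4, 4) object poses: rigidly attached to the hand during contact,
    held fixed otherwise.
    """
    T = hand_poses.shape[0]
    obj_poses = np.repeat(obj_pose_init[None], T, axis=0).copy()
    grasp_offset = None  # hand-to-object transform, latched at contact onset
    for t in range(T):
        if contact[t]:
            if grasp_offset is None:
                # Latch the relative transform at the moment contact starts.
                prev = obj_poses[t - 1] if t > 0 else obj_pose_init
                grasp_offset = np.linalg.inv(hand_poses[t]) @ prev
            obj_poses[t] = hand_poses[t] @ grasp_offset
        else:
            grasp_offset = None
            obj_poses[t] = obj_poses[t - 1] if t > 0 else obj_pose_init
    return obj_poses

In a full system, the contact flags and the initial object pose would come from the interaction-reasoning and scene-registration components described in the abstract; here they are simply inputs.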

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.od......1874..5cf19e0462e3fcde652b4adf69ca0447