Temporally enhanced graph convolutional network for hand tracking from an egocentric camera.

Authors :
Cho, Woojin
Ha, Taewook
Jeon, Ikbeom
Jeon, Jinwoo
Kim, Tae-Kyun
Woo, Woontack
Source :
Virtual Reality; Sep 2024, Vol. 28, Issue 3, p1-18, 18p
Publication Year :
2024

Abstract

We propose a robust 3D hand tracking system for diverse hand-action environments, including hand-object interaction, which takes a single color image and the previous pose prediction as input. We observe that existing methods exploit temporal information deterministically in motion space and therefore fail to handle realistically diverse hand motions. Prior methods also paid little attention to the balance between efficiency and robust performance, i.e., the trade-off between runtime and accuracy. The proposed Temporally Enhanced Graph Convolutional Network (TE-GCN) uses a two-stage framework to encode temporal information adaptively. The system achieves this balance by adopting an adaptive GCN, which effectively learns the spatial dependencies between hand mesh vertices, and it leverages the previous prediction by estimating its relevance to the current image features through an attention mechanism. The proposed method achieves a state-of-the-art balance of speed and accuracy on challenging benchmarks and produces robust results on varied hand motions in real scenes. Moreover, the hand tracking system is integrated into a recent HMD with an off-loading framework, achieving a real-time framerate while maintaining high accuracy. Our study improves the usability of high-performance hand tracking, can be generalized to other algorithms, and contributes to the everyday use of HMDs. Our code and the HMD project will be available at . [ABSTRACT FROM AUTHOR]
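The abstract names two architectural ideas: an adaptive GCN over hand mesh vertices and an attention mechanism that relates the previous pose prediction to current image features. The sketch below (not the authors' code; all module names, dimensions, and design details are illustrative assumptions) shows one minimal way these two components could look in PyTorch.

```python
# Minimal sketch of (1) a graph convolution with a learnable adjacency over
# mesh vertices and (2) cross-attention where previous-pose features attend
# to current image features. Illustrative only; not the TE-GCN implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    """Graph convolution whose adjacency over V vertices is learned."""

    def __init__(self, in_dim: int, out_dim: int, num_vertices: int):
        super().__init__()
        # Learnable adjacency, initialized near identity; softmax keeps rows normalized.
        self.adj_logits = nn.Parameter(torch.eye(num_vertices) * 2.0)
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, V, in_dim)
        adj = F.softmax(self.adj_logits, dim=-1)      # (V, V), learned during training
        x = torch.einsum("vw,bwc->bvc", adj, x)       # aggregate neighbor features
        return F.relu(self.linear(x))


class TemporalCrossAttention(nn.Module):
    """Previous-pose features query the current frame's image features."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, prev_pose_feat: torch.Tensor, image_feat: torch.Tensor) -> torch.Tensor:
        # prev_pose_feat: (batch, V, dim) from the previous frame's prediction
        # image_feat:     (batch, N, dim) flattened spatial features of the current frame
        fused, _ = self.attn(query=prev_pose_feat, key=image_feat, value=image_feat)
        return prev_pose_feat + fused                 # residual fusion


if __name__ == "__main__":
    B, V, N, C = 2, 778, 49, 64                       # 778 = MANO hand mesh vertex count
    prev_feat = torch.randn(B, V, C)
    img_feat = torch.randn(B, N, C)
    fused = TemporalCrossAttention(C)(prev_feat, img_feat)
    out = AdaptiveGraphConv(C, C, V)(fused)
    print(out.shape)                                  # torch.Size([2, 778, 64])
```

The learnable adjacency lets the network discover vertex dependencies beyond the fixed mesh topology, and the attention step weights the previous prediction by its relevance to the current image rather than using it deterministically, which matches the motivation stated in the abstract.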

Details

Language :
English
ISSN :
1359-4338
Volume :
28
Issue :
3
Database :
Complementary Index
Journal :
Virtual Reality
Publication Type :
Academic Journal
Accession number :
178794633
Full Text :
https://doi.org/10.1007/s10055-024-01039-3