3D hand pose and mesh estimation via a generic Topology-aware Transformer model

Authors: Shaoqi Yu, Yintong Wang, Lili Chen, Xiaolin Zhang, Jiamao Li
Source: Frontiers in Neurorobotics, Vol 18 (2024)
Publication Year: 2024
Publisher: Frontiers Media S.A., 2024.

Abstract

In Human-Robot Interaction (HRI), accurate 3D hand pose and mesh estimation is of critical importance. However, inferring reasonable and accurate poses under severe self-occlusion and high self-similarity remains an inherent challenge. To alleviate the ambiguity caused by invisible and similar joints during HRI, we propose a new Topology-aware Transformer network, HandGCNFormer, which takes a depth image as input and incorporates prior knowledge of hand kinematic topology into the network while modeling long-range contextual information. Specifically, we propose a novel Graphformer decoder with an additional Node-offset Graph Convolutional layer (NoffGConv). The Graphformer decoder optimizes the synergy between the Transformer and the GCN, capturing both long-range dependencies and local topological connections between joints. On top of that, we replace the standard MLP prediction head with a novel Topology-aware head to better exploit local topological constraints for more reasonable and accurate poses. Our method achieves state-of-the-art 3D hand pose estimation performance on four challenging datasets: Hands2017, NYU, ICVL, and MSRA. To further demonstrate the effectiveness and scalability of the proposed Graphformer decoder and Topology-aware head, we extend our framework to HandGCNFormer-Mesh for the 3D hand mesh estimation task. The extended framework efficiently integrates a shape regressor with the original Graphformer decoder and Topology-aware head to produce MANO parameters. Results on the HO-3D dataset, which contains varied and challenging occlusions, show that HandGCNFormer-Mesh achieves competitive results compared to previous state-of-the-art 3D hand mesh estimation methods.
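
The abstract describes a decoder that couples Transformer self-attention (long-range context between joints) with graph convolution over the hand's kinematic tree (local topological constraints). As a rough illustration of that idea only, the sketch below shows one such decoder block in PyTorch; the 21-joint skeleton layout, the adjacency construction, and the per-node "offset" branch are assumptions made for illustration and do not reproduce the paper's exact NoffGConv layer or Topology-aware head.

```python
# Illustrative sketch (not the authors' code): a decoder block that interleaves
# Transformer self-attention with a graph convolution over the hand's kinematic
# topology, in the spirit of the Graphformer decoder described in the abstract.
# The per-node "offset" branch below is an assumption; the paper's exact
# NoffGConv formulation is not reproduced here.
import torch
import torch.nn as nn

NUM_JOINTS = 21  # common hand-skeleton size; actual count is dataset-dependent

# Parent index per joint for a generic 21-joint hand skeleton (assumed layout:
# wrist + 4 joints per finger). Used only to build an adjacency matrix.
PARENTS = [-1,
           0, 1, 2, 3,      # thumb
           0, 5, 6, 7,      # index
           0, 9, 10, 11,    # middle
           0, 13, 14, 15,   # ring
           0, 17, 18, 19]   # pinky

def hand_adjacency(parents=PARENTS):
    """Symmetric, self-loop-augmented, row-normalized adjacency matrix."""
    n = len(parents)
    a = torch.eye(n)
    for child, parent in enumerate(parents):
        if parent >= 0:
            a[child, parent] = a[parent, child] = 1.0
    return a / a.sum(dim=1, keepdim=True)

class TopologyAwareDecoderBlock(nn.Module):
    """Self-attention (long-range context) + graph conv (local topology)."""
    def __init__(self, dim=256, heads=8):
        super().__init__()
        self.register_buffer("adj", hand_adjacency())
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.gcn_weight = nn.Linear(dim, dim)      # neighborhood aggregation
        self.offset_weight = nn.Linear(dim, dim)   # assumed per-node "offset" path

    def forward(self, joint_tokens):               # (B, NUM_JOINTS, dim)
        x = joint_tokens
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        h = self.norm2(x)
        # Graph convolution over the kinematic tree plus the per-node branch.
        x = x + torch.relu(self.adj @ self.gcn_weight(h) + self.offset_weight(h))
        return x

if __name__ == "__main__":
    block = TopologyAwareDecoderBlock()
    tokens = torch.randn(2, NUM_JOINTS, 256)
    print(block(tokens).shape)  # torch.Size([2, 21, 256])
```

The design point the abstract emphasizes is the synergy of the two paths: attention lets every joint token attend to all others (helpful when a joint is occluded), while the adjacency-based graph convolution keeps predictions consistent with the kinematic chain.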

Details

Language: English
ISSN: 1662-5218
Volume: 18
Database: Directory of Open Access Journals
Journal: Frontiers in Neurorobotics
Publication Type: Academic Journal
Accession number: edsdoj.0f727cd86ee4795b1d4323f5389736e
Document Type: Article
Full Text: https://doi.org/10.3389/fnbot.2024.1395652