
Distilling Token-Pruned Pose Transformer for 2D Human Pose Estimation

Authors:
Ren, Feixiang
Publication Year:
2023

Abstract

Human pose estimation has seen widespread use of transformer models in recent years. Pose transformers benefit from self-attention maps, which capture the correlations between human joint tokens and image tokens. However, training such models is computationally expensive. The recent Token-Pruned Pose Transformer (PPT) reduces this cost by pruning background tokens of the image, which are usually less informative. Although pruning improves efficiency, it inevitably leads to worse performance than TokenPose. To overcome this problem, we present Distilling Token-Pruned Pose Transformer (DPPT), a novel method for 2D human pose estimation. Our method leverages the output of a pre-trained TokenPose to supervise the learning of PPT, and additionally establishes connections between the internal structures of the two pose transformers, such as their attention maps and joint features. Experimental results on the MPII dataset show that DPPT significantly improves PCK compared to previous PPT models while still reducing computational complexity.
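
The distillation scheme the abstract describes combines three supervision signals from the frozen teacher (TokenPose): its predicted output, its self-attention maps, and its joint-token features. Below is a minimal PyTorch sketch of such a combined loss; the function name, the choice of MSE for each term, the loss weights, and the assumption that student and teacher tensors already share shapes are illustrative assumptions, not the paper's published formulation.

import torch.nn.functional as F

def distillation_loss(student_heat, teacher_heat,
                      student_attn, teacher_attn,
                      student_feat, teacher_feat,
                      alpha=1.0, beta=1.0, gamma=1.0):
    # Output distillation: match the student's predicted heatmaps
    # to the frozen teacher's heatmaps.
    loss_out = F.mse_loss(student_heat, teacher_heat)
    # Attention distillation: align self-attention maps. Because PPT
    # prunes background tokens, a real implementation would first
    # restrict the teacher's map to the kept token indices; here the
    # shapes are assumed to match already.
    loss_attn = F.mse_loss(student_attn, teacher_attn)
    # Feature distillation: align the joint (keypoint) token features.
    loss_feat = F.mse_loss(student_feat, teacher_feat)
    return alpha * loss_out + beta * loss_attn + gamma * loss_feat

In training, a term like this would typically be added to the standard heatmap regression loss against ground-truth keypoints, with the teacher network kept frozen throughout.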

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1381617156
Document Type:
Electronic Resource