Self-Adaptive Graph With Nonlocal Attention Network for Skeleton-Based Action Recognition.

Authors :
Pang C
Gao X
Chen Z
Lyu L
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2023 Sep 13; Vol. PP. Date of Electronic Publication: 2023 Sep 13.
Publication Year :
2023
Publisher :
Ahead of Print

Abstract

Graph convolutional networks (GCNs) have achieved encouraging progress in modeling human body skeletons as spatial-temporal graphs. However, existing methods still suffer from two inherent drawbacks. First, these models process the input data based on the physical structure of the human body, so some latent correlations among joints are ignored. Second, the key temporal relationships between nonadjacent frames are overlooked, preventing the model from fully learning how body joints change along the temporal dimension. To address these issues, we propose an innovative spatial-temporal model that introduces a self-adaptive GCN (SAGCN) with a global attention network, collectively termed SAGGAN. Specifically, the SAGCN module constructs two additional dynamic topological graphs to learn the common characteristics of all data and to represent a unique pattern for each sample, respectively. Meanwhile, the global attention module (comprising spatial attention (SA) and temporal attention (TA) modules) is designed to extract the global connections between different joints within a single frame and to model temporal relationships between adjacent and nonadjacent frames in temporal sequences. In this manner, our network captures richer action features for accurate action recognition and overcomes the limitations of the standard graph convolution. Extensive experiments on three benchmark datasets (NTU-60, NTU-120, and Kinetics) demonstrate the superiority of our proposed method.
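The record does not include code, but the abstract's three ingredients (a graph convolution over a skeleton adjacency augmented with a learned topology, spatial attention across joints within a frame, and temporal attention across adjacent and nonadjacent frames) can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the authors' implementation: the function names (`adaptive_graph_conv`, `spatial_attention`, `temporal_attention`), the choice of dot-product attention, and the additive learned adjacency offset are all hypothetical simplifications of what the paper describes.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_graph_conv(X, A_phys, B_learned, W):
    """One graph-conv layer over an adapted topology (illustrative).

    X: (T, V, C_in) joint features, A_phys: (V, V) skeleton adjacency,
    B_learned: (V, V) learned topology offset (stands in for the paper's
    dynamic graphs), W: (C_in, C_out) feature transform.
    """
    A = A_phys + B_learned                       # physical + learned topology
    D = A.sum(axis=1, keepdims=True) + 1e-6      # row sums for normalization
    return np.einsum('uv,tvc,cd->tud', A / D, X, W)

def spatial_attention(X):
    # dot-product attention across joints within each frame (SA analogue):
    # every joint attends to every other joint, not only skeletal neighbors
    scores = np.einsum('tvc,twc->tvw', X, X) / np.sqrt(X.shape[-1])
    return np.einsum('tvw,twc->tvc', softmax(scores), X)

def temporal_attention(X):
    # attention across all frames per joint (TA analogue): each frame attends
    # to adjacent AND nonadjacent frames in the sequence
    Xt = X.transpose(1, 0, 2)                    # (V, T, C)
    scores = np.einsum('vtc,vsc->vts', Xt, Xt) / np.sqrt(X.shape[-1])
    out = np.einsum('vts,vsc->vtc', softmax(scores), Xt)
    return out.transpose(1, 0, 2)                # back to (T, V, C)

if __name__ == "__main__":
    T, V, C = 4, 25, 8                           # 25 joints as in NTU skeletons
    rng = np.random.default_rng(0)
    X = rng.standard_normal((T, V, C))
    A = (rng.random((V, V)) < 0.2).astype(float) # toy skeleton adjacency
    B = 0.01 * rng.standard_normal((V, V))       # toy learned offset
    W = rng.standard_normal((C, 16))
    print(adaptive_graph_conv(X, A, B, W).shape) # (4, 25, 16)
    print(spatial_attention(X).shape)            # (4, 25, 8)
    print(temporal_attention(X).shape)           # (4, 25, 8)
```

The key point the sketch tries to convey is the contrast the abstract draws: the graph convolution is constrained by the (adapted) topology, whereas the SA/TA modules form dense joint-to-joint and frame-to-frame connections, which is how nonadjacent relationships enter the model.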

Details

Language :
English
ISSN :
2162-2388
Volume :
PP
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession number :
37703156
Full Text :
https://doi.org/10.1109/TNNLS.2023.3298950