
Spatiotemporal Co-Attention Recurrent Neural Networks for Human-Skeleton Motion Prediction.

Authors :
Shu, Xiangbo
Zhang, Liyan
Qi, Guo-Jun
Liu, Wei
Tang, Jinhui
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Jun 2022, Vol. 44, Issue 6, p3300-3315. 16p.
Publication Year :
2022

Abstract

Human motion prediction aims to generate future motions based on observed human motions. Witnessing the success of Recurrent Neural Networks (RNNs) in modeling sequential data, recent works utilize RNNs to model human-skeleton motions on the observed motion sequence and predict future human motions. However, these methods disregard the spatial coherence among joints and the temporal evolution among skeletons, which reflect the crucial characteristics of human motion in spatiotemporal space. To this end, we propose a novel Skeleton-Joint Co-Attention Recurrent Neural Network (SC-RNN) to simultaneously capture the spatial coherence among joints and the temporal evolution among skeletons on a skeleton-joint co-attention feature map in spatiotemporal space. First, a skeleton-joint feature map is constructed as the representation of the observed motion sequence. Second, we design a new Skeleton-Joint Co-Attention (SCA) mechanism to dynamically learn a skeleton-joint co-attention feature map from this skeleton-joint feature map, which refines the useful observed motion information for predicting each future motion. Third, a variant of the Gated Recurrent Unit (GRU) embedded with SCA collaboratively models the human-skeleton motion and human-joint motion in spatiotemporal space by regarding the skeleton-joint co-attention feature map as the motion context. Experimental results on human motion prediction demonstrate that the proposed method outperforms competing methods. [ABSTRACT FROM AUTHOR]
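To make the co-attention idea in the abstract concrete, below is a minimal, hypothetical sketch of re-weighting a skeleton-joint feature map along both its temporal (skeleton) and spatial (joint) axes and feeding the refined map to a GRU cell as motion context. The module name, layer shapes, and the outer-product combination of the two attention vectors are illustrative assumptions based only on the abstract, not the authors' released implementation.

```python
# Hypothetical sketch (assumptions, not the paper's code): a feature map of shape
# (T_obs, J) holds one scalar feature per observed skeleton (row) and joint (column).
import torch
import torch.nn as nn


class SkeletonJointCoAttention(nn.Module):
    """Re-weights a skeleton-joint feature map along both axes.

    Row (skeleton/temporal) and column (joint/spatial) attention scores are
    computed separately, conditioned on the previous hidden state, then combined
    into a single co-attention map that rescales the feature map.
    """

    def __init__(self, num_skeletons: int, num_joints: int, hidden_size: int):
        super().__init__()
        self.row_score = nn.Linear(hidden_size + num_joints, 1)     # one score per skeleton row
        self.col_score = nn.Linear(hidden_size + num_skeletons, 1)  # one score per joint column

    def forward(self, feat_map: torch.Tensor, hidden: torch.Tensor) -> torch.Tensor:
        # feat_map: (T_obs, J) skeleton-joint feature map; hidden: (H,) previous GRU state.
        T, J = feat_map.shape
        h_rows = hidden.unsqueeze(0).expand(T, -1)  # (T, H)
        h_cols = hidden.unsqueeze(0).expand(J, -1)  # (J, H)

        row_logits = self.row_score(torch.cat([h_rows, feat_map], dim=1)).squeeze(-1)      # (T,)
        col_logits = self.col_score(torch.cat([h_cols, feat_map.t()], dim=1)).squeeze(-1)  # (J,)

        row_attn = torch.softmax(row_logits, dim=0)  # temporal attention over skeletons
        col_attn = torch.softmax(col_logits, dim=0)  # spatial attention over joints

        # Outer product yields one weight per (skeleton, joint) cell: the co-attention map.
        co_attn = row_attn.unsqueeze(1) * col_attn.unsqueeze(0)  # (T, J)
        return co_attn * feat_map  # refined motion context


# Usage sketch: the refined map is flattened and passed to a GRU cell as the motion
# context for predicting the next pose (a stand-in for the SCA-embedded GRU variant).
T_obs, J, H = 10, 25, 128
sca = SkeletonJointCoAttention(T_obs, J, H)
gru = nn.GRUCell(input_size=T_obs * J, hidden_size=H)
feat_map = torch.randn(T_obs, J)
hidden = torch.zeros(H)
context = sca(feat_map, hidden)
hidden = gru(context.reshape(1, -1), hidden.unsqueeze(0)).squeeze(0)
```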

Details

Language :
English
ISSN :
0162-8828
Volume :
44
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
156742201
Full Text :
https://doi.org/10.1109/TPAMI.2021.3050918