
SpATr: MoCap 3D Human Action Recognition based on Spiral Auto-encoder and Transformer Network

Authors :
Bouzid, Hamza
Ballihi, Lahoucine
Publication Year :
2023
Publisher :
arXiv, 2023.

Abstract

Recent advancements in technology have expanded the possibilities of human action recognition by leveraging 3D data, which offers a richer representation of actions through the inclusion of depth information, enabling more accurate analysis of spatial and temporal characteristics. However, 3D human action recognition remains challenging due to the irregularity and disarrangement of the data points in action sequences. In this context, we present a novel model for human action recognition from fixed-topology mesh sequences based on a Spiral Auto-encoder and Transformer Network, namely SpATr. The proposed method first disentangles space and time in the mesh sequences. An auto-encoder is then used to extract spatial geometric features, and a tiny transformer captures the temporal evolution of the sequence. Previous methods either rely on 2D depth images, sample skeleton points, or require a large amount of memory, limiting them to short sequences. In this work, we demonstrate competitive recognition rates and high memory efficiency by building the auto-encoder on spiral convolutions, which are lightweight convolutions applied directly to mesh data with a fixed topology, and by modeling temporal evolution with attention, which can handle long sequences. The proposed method is evaluated on two 3D human action datasets: MoVi and BMLrub from the Archive of Motion Capture As Surface Shapes (AMASS). The results demonstrate the effectiveness of our method for 3D human action recognition while maintaining high memory efficiency. The code will soon be made publicly available.
Comment: 10 pages, 5 figures, submitted to IVC
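
The abstract describes the pipeline only at a high level; below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' released code. It assumes precomputed spiral neighbour orderings (`spiral_indices`) for the fixed mesh topology, shows only the encoder half of the spatial auto-encoder, and uses placeholder layer sizes, pooling choices, and class count.

```python
# Hypothetical sketch of the SpATr idea (not the authors' implementation):
# a spiral convolution encodes each fixed-topology mesh frame, and a small
# transformer models the temporal evolution of the per-frame embeddings.
import torch
import torch.nn as nn


class SpiralConv(nn.Module):
    """Spiral convolution: gather each vertex's spiral-ordered neighbours
    and apply one shared linear map (SpiralNet-style formulation)."""

    def __init__(self, in_ch, out_ch, spiral_indices):
        super().__init__()
        # spiral_indices: (V, L) long tensor, precomputed once for the
        # fixed mesh topology; L = spiral length (assumed given).
        self.register_buffer("spiral", spiral_indices)
        self.fc = nn.Linear(in_ch * spiral_indices.size(1), out_ch)

    def forward(self, x):                          # x: (B, V, in_ch)
        B, V, C = x.shape
        L = self.spiral.size(1)
        idx = self.spiral.reshape(-1)              # (V * L,)
        nbrs = x[:, idx, :].reshape(B, V, L * C)   # spiral neighbourhoods
        return self.fc(nbrs)                       # (B, V, out_ch)


class SpATrSketch(nn.Module):
    def __init__(self, spiral_indices, num_classes, d_model=128, n_heads=4):
        super().__init__()
        # Spatial branch: encoder half of the spiral auto-encoder.
        self.spatial = nn.Sequential(
            SpiralConv(3, 32, spiral_indices), nn.ELU(),
            SpiralConv(32, d_model, spiral_indices), nn.ELU(),
        )
        # Temporal branch: a tiny transformer over per-frame embeddings.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.temporal = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.cls = nn.Linear(d_model, num_classes)

    def forward(self, seq):                        # seq: (B, T, V, 3)
        B, T, V, _ = seq.shape
        frames = seq.reshape(B * T, V, 3)
        feats = self.spatial(frames).mean(dim=1)   # pool vertices -> (B*T, d)
        feats = feats.reshape(B, T, -1)
        temporal = self.temporal(feats)            # attention over frames
        return self.cls(temporal.mean(dim=1))      # sequence-level logits
```

In this sketch the spiral convolution stays lightweight because every vertex reuses a single shared linear map over its precomputed spiral-ordered neighbourhood, and the transformer attends over one embedding per frame rather than over raw vertices, which is what would keep long sequences tractable in memory.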

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....86c948e7515ea2ed688fc1dd3584db44
Full Text :
https://doi.org/10.48550/arxiv.2306.17574