
Learning Socio-Temporal Graphs for Multi-Agent Trajectory Prediction

Authors:
Li, Yuke
Chen, Lixiong
Chen, Guangyi
Chan, Ching-Yao
Zhang, Kun
Anzellotti, Stefano
Wei, Donglai
Publication Year:
2023

Abstract

To accurately predict a pedestrian's trajectory in a crowd, one must consistently account for that pedestrian's underlying socio-temporal interactions with other pedestrians. Unlike existing work, which represents this information separately, partially, or implicitly, we propose a complete representation that captures it fully and explicitly. In particular, we introduce a Directed Acyclic Graph-based structure, termed the Socio-Temporal Graph (STG), to explicitly capture pairwise socio-temporal interactions among a group of people across both space and time. Our model is built on a time-varying generative process whose latent variables determine the structure of the STGs. We design an attention-based model, STGformer, that provides an end-to-end pipeline for learning the structure of the STGs for trajectory prediction. Our solution achieves overall state-of-the-art prediction accuracy on two large-scale benchmark datasets. Our analysis shows that a person's past trajectory is critical for predicting another person's future path, and our model learns this relationship with a strong notion of socio-temporal locality. Statistically, utilizing this information explicitly for prediction yields a noticeable performance gain over trajectory-only approaches.
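
For a concrete picture of the structure described above, the sketch below enumerates a socio-temporal graph as a DAG over (agent, timestep) nodes and weights its edges with a toy, distance-based softmax. This is a minimal illustration under stated assumptions, not the authors' STGformer implementation: the function names (build_stg_edges, attention_edge_weights) and the distance-based scoring rule are hypothetical stand-ins for the learned attention described in the paper.

    import numpy as np

    def build_stg_edges(num_agents, num_steps):
        """Enumerate candidate directed edges of a socio-temporal graph.

        Nodes are (agent, timestep) pairs. An edge (i, t) -> (j, s) is only
        allowed when t < s, so information flows strictly forward in time,
        including across different agents; this keeps the graph acyclic.
        """
        edges = []
        for t in range(num_steps):
            for s in range(t + 1, num_steps):            # forward in time only => DAG
                for i in range(num_agents):
                    for j in range(num_agents):
                        edges.append(((i, t), (j, s)))   # socio-temporal pair
        return edges

    def attention_edge_weights(positions, edges, temperature=1.0):
        """Toy attention scores over candidate STG edges.

        `positions` has shape (num_agents, num_steps, 2) with x/y coordinates.
        Nearer socio-temporal pairs get larger scores; a softmax normalizes
        the scores over all incoming edges of each target node.
        """
        scores = {}
        for (i, t), (j, s) in edges:
            d = np.linalg.norm(positions[i, t] - positions[j, s])
            scores[((i, t), (j, s))] = -d / temperature  # nearer => higher score
        weights = {}
        targets = {tgt for _, tgt in edges}
        for tgt in targets:
            incoming = [e for e in edges if e[1] == tgt]
            logits = np.array([scores[e] for e in incoming])
            probs = np.exp(logits - logits.max())
            probs /= probs.sum()
            for e, p in zip(incoming, probs):
                weights[e] = p
        return weights

    # usage: 3 pedestrians observed over 4 timesteps with random positions
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 10, size=(3, 4, 2))
    edges = build_stg_edges(num_agents=3, num_steps=4)
    w = attention_edge_weights(pos, edges)
    print(f"{len(edges)} candidate socio-temporal edges")

In the paper, the edge structure is not fixed by a hand-crafted rule like the one above; it is governed by the latent variables of a time-varying generative process and learned end-to-end by STGformer.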

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.14373
Document Type:
Working Paper