Under the hood of transformer networks for trajectory forecasting.
- Source: Pattern Recognition, Jun 2023, Vol. 138
- Publication Year: 2023
Abstract
- Highlights:
- • First in-depth experimental study of Transformer Networks and BERT for trajectory forecasting.
- • Comparative evaluation on the modelling of individual human trajectories.
- • First exhaustive evaluation of input/output representations and problem formulations.
- • Focus on the capability of Transformers to predict multi-modal futures.
- • Analysis of the impact of intention and long-term forecasting.
- Transformer Networks have established themselves as the de-facto state of the art for trajectory forecasting, but there is currently no systematic study of their capability to model the motion patterns of people, without interactions with other individuals or the social context. There is abundant literature on LSTMs, CNNs and GANs on this subject. However, methods adopting Transformer techniques achieve strong performance through complex models, and a clear analysis of their adoption as plain sequence models is missing. This paper proposes the first in-depth study of Transformer Networks (TF) and Bidirectional Transformers (BERT) for forecasting the individual motion of people, without bells and whistles. We conduct an exhaustive evaluation of input/output representations, problem formulations and sequence modelling, including a novel analysis of their capability to predict multi-modal futures. In a comparative evaluation on the ETH+UCY benchmark, both TF and BERT are top performers in predicting individual motions and remain within a narrow margin with respect to more complex techniques that include both social interactions and scene contexts. Source code will be released for all conducted experiments. [ABSTRACT FROM AUTHOR]
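The abstract mentions an exhaustive evaluation of input/output representations for the sequence model. A minimal sketch of one such representational choice (not the authors' code; the function names and the position-vs-displacement framing are illustrative assumptions): feeding the model per-step displacements rather than absolute (x, y) positions, and integrating predicted displacements back into positions.

```python
# Illustrative sketch, not the paper's implementation: one common
# input/output representation choice for trajectory models is per-step
# displacements ("speeds") instead of absolute (x, y) positions.

def to_displacements(track):
    """Convert a list of absolute (x, y) positions to per-step displacements."""
    return [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(track, track[1:])]

def to_positions(start, displacements):
    """Integrate displacements back into absolute positions from a start point."""
    positions = [start]
    for dx, dy in displacements:
        px, py = positions[-1]
        positions.append((px + dx, py + dy))
    return positions

track = [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0)]
disp = to_displacements(track)          # [(1.0, 0.0), (1.0, 1.0)]
recovered = to_positions(track[0], disp)
```

Displacements make the representation translation-invariant, which is one reason such encodings are compared against raw coordinates in this kind of study; the round trip above recovers the original track exactly.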
Details
- Language: English
- ISSN: 0031-3203
- Volume: 138
- Database: Academic Search Index
- Journal: Pattern Recognition
- Publication Type: Academic Journal
- Accession number: 162256833
- Full Text: https://doi.org/10.1016/j.patcog.2023.109372