Generating natural pedestrian crowds by learning real crowd trajectories through a transformer-based GAN.
- Source :
- Visual Computer. Apr 2024, p1-18.
- Publication Year :
- 2024
Abstract
- Traditional methods for constructing crowd simulations often fall short in realism, and data-driven methods are an effective way to enhance the visual realism of crowd simulation. However, existing work mainly builds crowd simulations either through deep-learning-based prediction or by fitting the parameters of traditional methods, which limits the expressiveness of the model. In response to these limitations, this paper introduces a method capable of generating realistic pedestrian crowds. The approach uses a Generative Adversarial Network, complemented by a transformer module, to learn behavioral patterns from real crowd trajectories. A transformer module extracts trajectory features of the crowd; a dedicated data-processing mechanism converts the spatial relationships between individuals into sequences, from which a second transformer module extracts social features, while each individual's movement is guided by its target direction. During training, the model learns simultaneously from real crowd data and from simulation data in which collisions are resolved by traditional methods, strengthening the collision avoidance of virtual crowds while preserving the movement patterns of real crowds, which yields more general collision avoidance behavior. The crowds generated by the model are not limited to specific scenarios and generalize across environments. After training on publicly available large-scale pedestrian datasets, our method outperforms other models. Our code is publicly available at https://github.com/ydp91/NPCGAN. [ABSTRACT FROM AUTHOR]
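The full implementation is in the linked NPCGAN repository; as a rough illustration of the architecture the abstract describes (a transformer over each agent's past trajectory, a second transformer over neighbor relations serialized into a sequence, goal-direction conditioning, and a GAN-style generator emitting displacements), here is a minimal sketch assuming PyTorch. All class, function, and parameter names (`TrajectoryGenerator`, `make_encoder`, `d_model`, `noise_dim`, etc.) are hypothetical and not taken from the paper or the repository.

```python
# Minimal sketch of a transformer-based trajectory generator (assumed
# PyTorch; all names here are hypothetical, not from the NPCGAN repo).
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_encoder(d_model: int, nhead: int, layers: int) -> nn.TransformerEncoder:
    layer = nn.TransformerEncoderLayer(
        d_model, nhead, dim_feedforward=4 * d_model, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=layers)

class TrajectoryGenerator(nn.Module):
    """Generates one next-step displacement per agent, GAN-style."""

    def __init__(self, d_model: int = 64, nhead: int = 4,
                 layers: int = 2, noise_dim: int = 16):
        super().__init__()
        self.noise_dim = noise_dim
        self.embed_traj = nn.Linear(2, d_model)    # (dx, dy) per past step
        self.embed_social = nn.Linear(2, d_model)  # offsets to neighbors
        self.traj_enc = make_encoder(d_model, nhead, layers)
        self.social_enc = make_encoder(d_model, nhead, layers)
        # Fuse trajectory feature + social feature + goal direction + noise.
        self.head = nn.Sequential(
            nn.Linear(2 * d_model + 2 + noise_dim, d_model),
            nn.ReLU(),
            nn.Linear(d_model, 2))                 # next displacement (dx, dy)

    def forward(self, past: torch.Tensor, neighbors: torch.Tensor,
                goal_dir: torch.Tensor) -> torch.Tensor:
        # past:      (B, T, 2) observed displacements of one agent
        # neighbors: (B, N, 2) current offsets to N nearby agents, treated
        #            as a sequence so the transformer can attend over them
        # goal_dir:  (B, 2)    unit vector toward the agent's target
        h_traj = self.traj_enc(self.embed_traj(past)).mean(dim=1)
        h_soc = self.social_enc(self.embed_social(neighbors)).mean(dim=1)
        z = torch.randn(past.size(0), self.noise_dim, device=past.device)
        return self.head(torch.cat([h_traj, h_soc, goal_dir, z], dim=-1))

# Smoke test: batch of 8 agents, 12 observed steps, 6 neighbors each.
gen = TrajectoryGenerator()
step = gen(torch.randn(8, 12, 2), torch.randn(8, 6, 2),
           F.normalize(torch.randn(8, 2), dim=-1))
print(step.shape)  # torch.Size([8, 2])
```

Under the training scheme the abstract outlines, a discriminator (not shown) would score generated trajectories against both real pedestrian data and traditional-method simulation data with collisions resolved, so the adversarial loss balances realistic motion patterns against collision avoidance.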
Details
- Language :
- English
- ISSN :
- 0178-2789
- Database :
- Academic Search Index
- Journal :
- Visual Computer
- Publication Type :
- Academic Journal
- Accession Number :
- 176889019
- Full Text :
- https://doi.org/10.1007/s00371-024-03385-4