A trend graph attention network for traffic prediction.
- Author
- Wang, Chu; Tian, Ran; Hu, Jia; Ma, Zhongyu
- Subjects
- *FORECASTING, *KNOWLEDGE transfer, *SPATIO-temporal variation, *HETEROGENEITY, *LOGICAL prediction, TRAVEL planning
- Abstract
• Fine-grained modeling of temporal heterogeneity and spatial heterogeneity.
• Trend spatial attention module models spatial heterogeneity.
• Pyramidal attention models temporal heterogeneity and long-term dependence.
• Trend construction module introduces local and global trend blocks.
• TGAN achieves state-of-the-art performance on multiple datasets.

Traffic prediction is an important part of urban computing. Accurate traffic prediction helps the public plan travel routes and assists the relevant departments in traffic management, thereby improving travel efficiency. Existing approaches usually use graph neural networks or attention mechanisms to capture the spatial–temporal correlations in traffic data, but they neglect to model the spatial heterogeneity and temporal heterogeneity of traffic data at a fine-grained level, which biases the prediction results. To address these challenges, we propose a Trend Graph Attention Network (TGAN) for traffic prediction. Specifically, we design a trend spatial attention module that constructs the spatial graph structure in a trend-to-trend form; its main idea is to transfer information between nodes with similar attributes, thereby addressing spatial heterogeneity. To model long-term temporal dependence, we introduce a trend construction module that builds local and global trend blocks and performs aggregation between time steps and trend blocks, so that each time step shares the local and global fields. Lastly, we let future and historical data interact directly to generate multi-step prediction results at once. Experimental results on five datasets across two types of traffic prediction tasks show that TGAN outperforms state-of-the-art baselines. [ABSTRACT FROM AUTHOR]
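The trend-to-trend idea described above (attention weights that favor nodes with similar trends, so information flows between nodes with similar attributes) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the paper's actual TGAN implementation: the trend vector (first differences of recent history), the cosine-similarity scoring, and the `temperature` parameter are all assumptions made for illustration.

```python
import numpy as np

def trend_attention(history, features, temperature=1.0):
    """Illustrative trend-to-trend spatial attention (assumed form, not TGAN itself).

    history:  (N, T) recent observations per node, used to form trend vectors.
    features: (N, D) current node features to aggregate.
    Returns:  (N, D) features aggregated from trend-similar nodes.
    """
    # Trend vector per node: first differences of its recent history (assumption).
    trend = np.diff(history, axis=1)                      # (N, T-1)
    # Cosine similarity between node trends.
    norm = np.linalg.norm(trend, axis=1, keepdims=True) + 1e-8
    unit = trend / norm
    sim = unit @ unit.T                                   # (N, N)
    # Row-wise softmax: attention weights concentrate on trend-similar nodes.
    logits = sim / temperature
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)
    # Aggregate features across the trend-similarity graph.
    return attn @ features

# Tiny example: nodes 0 and 1 share a rising trend; node 2 is falling.
hist = np.array([[1., 2., 3., 4.],
                 [0., 1., 2., 3.],
                 [4., 3., 2., 1.]])
feat = np.eye(3)  # identity features make the attention weights directly visible
out = trend_attention(hist, feat)
```

With identity features, each output row equals the attention row itself, so one can check that node 0 draws far more weight from trend-similar node 1 than from the falling node 2.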
- Published
- 2023