DRGN: a dynamically reconfigurable accelerator for graph neural networks.
- Source :
- Journal of Ambient Intelligence & Humanized Computing; Jul2023, Vol. 14 Issue 7, p8985-9000, 16p
- Publication Year :
- 2023
Abstract
- Graph neural networks (GNNs) have achieved great success in processing non-Euclidean geometric data structures. However, the irregular memory accesses of the aggregation phase and the power-law degree distribution of real-world graphs challenge the memory hierarchies and caching policies of existing CPUs and GPUs. Meanwhile, the growing number of GNN algorithms places higher demands on the flexibility of the hardware architecture. In this work, we design a dynamically reconfigurable GNN accelerator (named DRGN) that supports multiple GNN algorithms. Specifically, we first propose a vertex reordering algorithm and an adjacency-matrix compression algorithm to improve graph data locality. Furthermore, to improve bandwidth utilization and the reuse rate of node features, we propose a dedicated prefetcher that significantly improves the hit rate. Finally, we propose a scheduling mechanism that assigns tasks to PE units to address workload imbalance. The effectiveness of the proposed DRGN accelerator was evaluated on three GNN algorithms: PageRank, GCN, and GraphSage. Compared to their execution times on a CPU, DRGN achieves speedups of 231× for PageRank, 150× for GCN, and 39× for GraphSage. Compared with state-of-the-art GNN accelerators, DRGN achieves higher energy efficiency despite being implemented in a relatively lower-end process. [ABSTRACT FROM AUTHOR]
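The abstract does not specify DRGN's actual reordering or compression algorithms; as a hypothetical sketch of the general idea, the following relabels vertices by descending degree (clustering the "hot" hub rows of a power-law graph together for better locality) and then stores the adjacency matrix in compressed sparse row (CSR) form:

```python
# Hypothetical sketch only: the paper's concrete algorithms are not given
# in the abstract. Degree-based reordering + CSR is one common approach.
from collections import defaultdict


def degree_reorder(edges, num_vertices):
    """Relabel vertices so high-degree (hub) vertices receive low IDs,
    which clusters frequently accessed rows and improves cache locality."""
    degree = [0] * num_vertices
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    # Vertex IDs sorted by descending degree; build old -> new mapping.
    order = sorted(range(num_vertices), key=lambda v: -degree[v])
    remap = {old: new for new, old in enumerate(order)}
    return [(remap[u], remap[v]) for u, v in edges], remap


def to_csr(edges, num_vertices):
    """Compress the (directed) adjacency matrix into CSR: a row-pointer
    array plus column indices, avoiding the mostly-zero dense matrix."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    row_ptr, col_idx = [0], []
    for u in range(num_vertices):
        col_idx.extend(sorted(adj[u]))
        row_ptr.append(len(col_idx))
    return row_ptr, col_idx


# Small example: vertex 0 is the hub, so it keeps the lowest new ID.
edges = [(0, 1), (0, 2), (0, 3), (2, 3)]
reordered, remap = degree_reorder(edges, 4)
row_ptr, col_idx = to_csr(reordered, 4)
```

In CSR form, the neighbors of vertex `u` occupy the contiguous slice `col_idx[row_ptr[u]:row_ptr[u+1]]`, so an aggregation pass streams through memory sequentially per row.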
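The scheduling mechanism itself is likewise not detailed in the abstract. One standard way to mitigate workload imbalance across processing elements (PEs), sketched here purely as an illustrative assumption, is greedy longest-processing-time-first assignment: process high-degree vertices first and always give the next vertex to the currently least-loaded PE.

```python
# Hypothetical sketch: greedy LPT-style balancing, not DRGN's actual scheduler.
import heapq


def assign_to_pes(vertex_degrees, num_pes):
    """Assign each vertex to the least-loaded PE, heaviest vertices first,
    using vertex degree as a proxy for aggregation work."""
    # Min-heap of (current_load, pe_id) so the lightest PE pops first.
    heap = [(0, pe) for pe in range(num_pes)]
    heapq.heapify(heap)
    assignment = {}
    for v, deg in sorted(vertex_degrees.items(), key=lambda kv: -kv[1]):
        load, pe = heapq.heappop(heap)
        assignment[v] = pe
        heapq.heappush(heap, (load + deg, pe))
    return assignment


# Example: two heavy and two light vertices split evenly across 2 PEs.
mapping = assign_to_pes({0: 10, 1: 9, 2: 2, 3: 1}, num_pes=2)
```

Sorting heavy vertices first bounds the final load spread, which is exactly the imbalance that hurts PE utilization on power-law graphs.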
Details
- Language :
- English
- ISSN :
- 1868-5137
- Volume :
- 14
- Issue :
- 7
- Database :
- Complementary Index
- Journal :
- Journal of Ambient Intelligence & Humanized Computing
- Publication Type :
- Academic Journal
- Accession number :
- 164130952
- Full Text :
- https://doi.org/10.1007/s12652-022-04402-x