Feature-Oriented Sampling for Fast and Scalable GNN Training
- Publication Year :
- 2022
Abstract
- Recently, Graph Neural Networks (GNNs) have achieved great success in many applications. To apply GNNs to large graphs, mini-batch training and sampling are widely adopted by recent works. However, existing works generate mini-batches following a topology-oriented sampling style, which first samples a subgraph and then fetches the corresponding node features to construct a mini-batch. This inevitably incurs intensive random access to graph data, exponential growth of the batch size, and constrained candidates during sampling. In this work, we advocate a feature-oriented sampling style that can overcome these drawbacks: we first sample the node features and then induce the corresponding subgraph to form a mini-batch. We apply the feature-oriented sampling method to three mainstream GNN models to demonstrate the effectiveness and efficiency of this sampling style. Experiments on four large-scale datasets show that feature-oriented sampling achieves accuracy comparable to topology-oriented sampling while speeding up the training procedure by 2.2 to 7.9 times. © 2022 IEEE.
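- The following is a minimal sketch, not the authors' implementation, of the feature-oriented ordering described in the abstract: fetch the node features for a sampled batch first, then induce the subgraph among those nodes. It assumes a graph stored in CSR form (indptr, indices) with a dense feature matrix; the function name and all parameters are illustrative.

```python
import numpy as np

def feature_oriented_minibatch(indptr, indices, features, batch_nodes):
    """Hypothetical feature-oriented mini-batch: read the feature rows for
    the sampled nodes first, then induce the edges among those nodes."""
    batch_nodes = np.asarray(batch_nodes)
    feats = features[batch_nodes]                 # feature fetch comes first
    # Map global node ids to positions within the batch (-1 = not sampled).
    pos = np.full(features.shape[0], -1, dtype=np.int64)
    pos[batch_nodes] = np.arange(len(batch_nodes))
    src, dst = [], []
    for new_u, u in enumerate(batch_nodes):       # induce edges within the batch
        for v in indices[indptr[u]:indptr[u + 1]]:
            new_v = pos[v]
            if new_v >= 0:
                src.append(new_u)
                dst.append(new_v)
    return feats, np.asarray(src), np.asarray(dst)

# Toy usage: a 4-node path graph 0-1-2-3 with 2-D features, batch = {0, 1, 2}.
indptr   = np.array([0, 1, 3, 5, 6])
indices  = np.array([1, 0, 2, 1, 3, 2])
features = np.arange(8, dtype=np.float32).reshape(4, 2)
feats, src, dst = feature_oriented_minibatch(indptr, indices, features, [0, 1, 2])
print(feats.shape, list(zip(src, dst)))           # (3, 2) and the induced edges
```

- Because the feature rows are chosen before any graph traversal, they can be read as a contiguous or pre-planned slice, which is the access-pattern advantage the abstract attributes to feature-oriented sampling; a topology-oriented sampler would instead gather scattered feature rows after expanding the subgraph.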
Details
- Database :
- OAIster
- Notes :
- English
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1376637971
- Document Type :
- Electronic Resource