Accelerating Large-Scale Heterogeneous Interaction Graph Embedding Learning via Importance Sampling
- Source :
- ACM Transactions on Knowledge Discovery from Data. 15:1-23
- Publication Year :
- 2020
- Publisher :
- Association for Computing Machinery (ACM), 2020.
Abstract
- In real-world problems, heterogeneous entities are often related to each other through multiple interactions, forming a Heterogeneous Interaction Graph (HIG). When modeling HIGs for fundamental tasks, graph neural networks offer an attractive opportunity to make full use of the heterogeneity and rich semantic information by aggregating and propagating information from different types of neighborhoods. However, learning on such complex graphs, often with millions or billions of nodes, edges, and various attributes, can incur high time and memory costs. In this article, we attempt to accelerate representation learning on large-scale HIGs by adopting importance sampling of heterogeneous neighborhoods in a batch-wise manner, which naturally fits most batch-based optimizations. Distinct from traditional homogeneous strategies that neglect the semantic types of nodes and edges, to handle the rich heterogeneous semantics within HIGs we devise both type-dependent and type-fusion samplers, where the former samples neighborhoods of each type separately and the latter jointly samples from candidates of all types. Furthermore, to overcome the imbalance between the down-sampled and the original information, we propose heterogeneous estimators, including a self-normalized estimator and an adaptive estimator, to improve the robustness of our sampling strategies. Finally, we evaluate our models on node classification and link prediction over five real-world datasets. The empirical results demonstrate that our approach performs significantly better than other state-of-the-art alternatives, and is able to reduce the number of edges in computation by up to 93%, the memory cost by up to 92%, and the time cost by up to 86%.
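- To make the general idea concrete, the following is a minimal NumPy sketch, not taken from the paper, of what type-dependent importance sampling with a self-normalized estimate of the aggregated neighborhood message could look like; all names, the degree-style proposal weights, and the per-type averaging are illustrative assumptions.

```python
import numpy as np

def type_dependent_sample(neighbors_by_type, weights_by_type, k, rng):
    """Hypothetical type-dependent sampler: for each node type, draw k neighbors
    with probability proportional to an importance weight, and return the sampled
    ids together with self-normalized importance weights."""
    sampled = {}
    for ntype, neigh in neighbors_by_type.items():
        w = np.asarray(weights_by_type[ntype], dtype=float)
        q = w / w.sum()                                  # proposal distribution q(v)
        idx = rng.choice(len(neigh), size=k, replace=True, p=q)
        iw = 1.0 / (k * q[idx])                          # raw importance weights 1 / (k q(v))
        iw /= iw.sum()                                   # self-normalize so the weights sum to 1
        sampled[ntype] = ([neigh[i] for i in idx], iw)
    return sampled

def aggregate(features, sampled):
    """Self-normalized estimate of the aggregated message: a weighted mean over the
    sampled neighbors of each type, followed by a plain mean over types."""
    per_type = []
    for ntype, (ids, iw) in sampled.items():
        feats = np.stack([features[v] for v in ids])     # shape (k, d)
        per_type.append((iw[:, None] * feats).sum(axis=0))
    return np.mean(per_type, axis=0)

# Toy usage with made-up node ids, weights, and random features.
rng = np.random.default_rng(0)
neighbors = {"author": [1, 2, 3, 4], "venue": [10, 11]}
weights   = {"author": [5, 1, 1, 3], "venue": [2, 2]}
features  = {v: rng.normal(size=8) for v in [1, 2, 3, 4, 10, 11]}
msg = aggregate(features, type_dependent_sample(neighbors, weights, 2, rng))
print(msg.shape)  # (8,)
```

- A type-fusion sampler, by contrast, would pool the candidates of all types into a single proposal distribution and draw one joint sample, rather than sampling each type separately as above.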
- Subjects :
- Theoretical computer science
General Computer Science
Computer science
Graph embedding
Computation
Estimator
02 engineering and technology
Time cost
High memory
Homogeneous
020204 information systems
0202 electrical engineering, electronic engineering, information engineering
020201 artificial intelligence & image processing
Feature learning
Importance sampling
Details
- ISSN :
- 1556-472X and 1556-4681
- Volume :
- 15
- Database :
- OpenAIRE
- Journal :
- ACM Transactions on Knowledge Discovery from Data
- Accession number :
- edsair.doi...........8b640a3ac7fd6e447f1a21f8f483eee6
- Full Text :
- https://doi.org/10.1145/3418684