
Multiplex Graph Contrastive Learning with Soft Negatives

Authors :
Zhao, Zhenhao
Zhu, Minhong
Wang, Chen
Wang, Sijia
Zhang, Jiqiang
Chen, Li
Cai, Weiran
Publication Year :
2024

Abstract

Graph Contrastive Learning (GCL) seeks to learn node- or graph-level representations that preserve maximal consistent information from graph-structured data. While node-level contrasting modes dominate, some efforts have begun to explore consistency across different scales. Yet these approaches tend to lose consistent information and to be contaminated by disturbing features. Here, we introduce MUX-GCL, a novel cross-scale contrastive learning paradigm that utilizes multiplex representations as effective patches. While this learning mode minimizes contaminating noise, a commensurate contrasting strategy using positional affinities further avoids information loss by correcting false negative pairs across scales. Extensive downstream experiments demonstrate that MUX-GCL yields multiple state-of-the-art results on public datasets. Our theoretical analysis further guarantees that the new objective function is a stricter lower bound on the mutual information between raw input features and output embeddings, which rationalizes this paradigm. Code is available at https://github.com/MUX-GCL/Code.
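To make the "soft negatives" idea concrete, the sketch below shows one plausible way an InfoNCE-style contrastive loss can down-weight suspected false negatives by an affinity score. This is only a minimal illustration under stated assumptions, not the MUX-GCL implementation (which lives at the linked repository); the function name `soft_negative_infonce`, the `affinity` matrix, and the temperature `tau` are all hypothetical placeholders.

```python
# Hypothetical sketch of a contrastive loss with "soft negatives":
# negative pairs are down-weighted by an affinity score in [0, 1], so that
# likely false negatives (high-affinity pairs) contribute less to the
# denominator. Names are illustrative, not taken from the MUX-GCL codebase.

import torch
import torch.nn.functional as F

def soft_negative_infonce(z1: torch.Tensor,
                          z2: torch.Tensor,
                          affinity: torch.Tensor,
                          tau: float = 0.5) -> torch.Tensor:
    """InfoNCE-style loss over two views z1, z2 of shape [N, d].

    affinity[i, j] in [0, 1] estimates how likely node j is a false negative
    for node i (e.g. derived from positional similarity); higher affinity
    means the pair contributes less as a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    sim = torch.exp(z1 @ z2.t() / tau)            # [N, N] cross-view similarities

    pos = sim.diag()                              # aligned pairs are positives
    neg_weight = 1.0 - affinity                   # soften suspected false negatives
    neg_weight.fill_diagonal_(0.0)                # exclude the positive pair itself
    denom = pos + (neg_weight * sim).sum(dim=1)   # affinity-weighted denominator

    return -torch.log(pos / denom).mean()

# Toy usage with random embeddings and a random affinity matrix.
if __name__ == "__main__":
    N, d = 8, 16
    z1, z2 = torch.randn(N, d), torch.randn(N, d)
    affinity = torch.rand(N, N)
    print(soft_negative_infonce(z1, z2, affinity).item())
```

With `affinity` set to all zeros this reduces to a standard InfoNCE objective; raising the affinity of a pair smoothly removes it from the pool of negatives, which is the intuition behind correcting false negative pairs described in the abstract.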

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2409.08010
Document Type :
Working Paper