
A Good View for Graph Contrastive Learning.

Authors :
Chen, Xueyuan
Li, Shangzhe
Source :
Entropy; Mar2024, Vol. 26 Issue 3, p208, 18p
Publication Year :
2024

Abstract

Following the success of contrastive learning in deep neural networks, graph contrastive learning has attracted a surge of research interest, largely owing to its strong performance on graphs with limited labeled data. In contrastive learning, the choice of "view" determines the information captured by the representation and thus the model's performance. However, assessing the quality of the information in these views is challenging, and what constitutes a good view remains unclear. This paper addresses this issue by defining a good view through graph information bottleneck and structural entropy theories. Building on these theoretical insights, we introduce CtrlGCL, a novel method that obtains a beneficial view for graph contrastive learning via coding tree representation learning. Extensive experiments were conducted to verify the effectiveness of the proposed view in unsupervised and semi-supervised learning. In particular, CtrlGCL-H yields an average accuracy improvement of 1.06% under unsupervised learning compared to GCL, underscoring the efficacy of the proposed method. [ABSTRACT FROM AUTHOR]
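The record contains no implementation details of CtrlGCL itself. As a hedged illustration only, the sketch below shows the standard NT-Xent (InfoNCE) contrastive objective that graph contrastive learning methods of this kind typically optimize between two views, assuming per-graph embeddings `z1` and `z2` (row `i` of each coming from two views of the same graph) have already been produced by some GNN encoder; the function name and setup are hypothetical, not the paper's API:

```python
import numpy as np

def nt_xent_loss(z1, z2, tau=0.5):
    """NT-Xent loss over two aligned batches of view embeddings.

    Row i of z1 and row i of z2 are two views of the same graph and form
    the positive pair; all other rows in the combined batch are negatives.
    (Illustrative sketch, not the CtrlGCL algorithm from the paper.)
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0).astype(float)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)       # unit-normalize rows
    sim = (z @ z.T) / tau                                  # cosine similarity / temperature
    np.fill_diagonal(sim, -np.inf)                         # a sample is never its own negative
    # index of each row's positive partner: i <-> i + n
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # numerically stable log-softmax denominator per row
    row_max = sim.max(axis=1, keepdims=True)
    log_den = row_max[:, 0] + np.log(np.exp(sim - row_max).sum(axis=1))
    return float(-(sim[np.arange(2 * n), pos] - log_den).mean())
```

Lowering this loss pulls the two views of each graph together while pushing apart views of different graphs, which is why the information content of the chosen views (the paper's central question) directly shapes what the learned representation retains.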

Details

Language :
English
ISSN :
1099-4300
Volume :
26
Issue :
3
Database :
Complementary Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
176302851
Full Text :
https://doi.org/10.3390/e26030208