
Self-supervised contrastive graph representation with node and graph augmentation.

Authors :
Duan H
Xie C
Li B
Tang P
Source :
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2023 Oct; Vol. 167, pp. 223-232. Date of Electronic Publication: 2023 Aug 24.
Publication Year :
2023

Abstract

Graph representation is a critical technology in knowledge engineering and knowledge-based applications, since most knowledge bases are represented as graph structures. Contrastive learning has become a prominent approach to graph representation: it contrasts positive-positive and positive-negative node pairs between two augmentation graphs, and it has achieved new state-of-the-art results in self-supervised graph representation. However, existing contrastive graph representation methods mainly generate the augmentation graph by modifying the original graph structure, normally by removing some edges or nodes. This inevitably changes the original structure, so the generated augmentation graph is no longer equivalent to the original graph, which harms representation quality on structure-sensitive graphs such as protein, chemical, and molecular graphs. Moreover, self-supervised graph contrastive learning provides only one positive-positive node pair per node against a comparatively massive number of positive-negative pairs, so samples of the same class, or very similar samples, can be treated as negatives. To this end, we propose a Virtual Masking Augmentation (VMA) that generates an augmentation graph without changing any structure of the original graph. Meanwhile, a node augmentation method is proposed that augments the positive node pairs by discovering the most similar nodes in the same graph. Two different augmentation graphs are then generated and fed into a contrastive learning model to learn the graph representation. Extensive experiments on a large number of datasets demonstrate that our method achieves new state-of-the-art results in self-supervised graph representation. The source code of the proposed method is available at https://github.com/DuanhaoranCC/CGRA.

Competing Interests: Declaration of competing interest: The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: Cheng Xie reports financial support provided by the National Natural Science Foundation of China. Cheng Xie reports a relationship with Yunnan University that includes employment.

(Copyright © 2023 The Author(s). Published by Elsevier Ltd. All rights reserved.)
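
The abstract describes two augmentations: feature masking that leaves graph topology intact (VMA) and similarity-based mining of extra positive pairs. Below is a minimal PyTorch sketch of both ideas; the function names, the zero-vector mask, and the default rates are illustrative assumptions, not the authors' implementation, which is available at the GitHub link above.

import torch
import torch.nn.functional as F


def virtual_masking(x: torch.Tensor, mask_rate: float = 0.3) -> torch.Tensor:
    """Mask a random subset of node features instead of dropping nodes/edges.

    Only features are overwritten, so the augmented view keeps the original
    graph topology intact, matching the motivation behind VMA.
    """
    masked = torch.rand(x.size(0), device=x.device) < mask_rate
    x_aug = x.clone()
    x_aug[masked] = 0.0  # a learnable mask token could replace the zero vector
    return x_aug


def most_similar_positives(z: torch.Tensor, k: int = 1) -> torch.Tensor:
    """Return, for each node embedding, the indices of its k most similar
    nodes (by cosine similarity, self excluded) to serve as extra positives.
    """
    z_norm = F.normalize(z, dim=1)
    sim = z_norm @ z_norm.T
    sim.fill_diagonal_(float("-inf"))  # a node must not pair with itself
    return sim.topk(k, dim=1).indices

In a full pipeline along these lines, two views produced by virtual_masking would be encoded by a shared GNN, and most_similar_positives applied to the embeddings would supply additional positive pairs for an InfoNCE-style contrastive loss.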

Details

Language :
English
ISSN :
1879-2782
Volume :
167
Database :
MEDLINE
Journal :
Neural networks : the official journal of the International Neural Network Society
Publication Type :
Academic Journal
Accession number :
37660671
Full Text :
https://doi.org/10.1016/j.neunet.2023.08.039