
Dual Information Enhanced Multiview Attributed Graph Clustering.

Authors :
Lin JQ
Chen MS
Zhu XR
Wang CD
Zhang H
Source :
IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2024 May 30; Vol. PP. Date of Electronic Publication: 2024 May 30.
Publication Year :
2024
Publisher :
Ahead of Print

Abstract

Multiview attributed graph clustering is an important approach for partitioning multiview data based on the attribute characteristics and adjacency matrices from different views. Some attempts have been made using graph neural networks (GNNs), and these have achieved promising clustering performance. Despite this, few of them pay attention to the inherent view-specific information embedded in multiple views. Meanwhile, they are incapable of recovering the latent high-level representation from the low-level ones, greatly limiting the downstream clustering performance. To fill these gaps, a novel dual information enhanced multiview attributed graph clustering (DIAGC) method is proposed in this article. Specifically, the proposed method introduces the specific information reconstruction (SIR) module to disentangle the exploration of consensus and specific information from multiple views, which enables the graph convolutional network (GCN) to capture more essential low-level representations. Besides, the contrastive learning (CL) module maximizes the agreement between the latent high-level representation and the low-level ones and enables the high-level representation to satisfy the desired clustering structure with the help of the self-supervised clustering (SC) module. Extensive experiments on several real-world benchmarks demonstrate the effectiveness of the proposed DIAGC method compared with state-of-the-art baselines.
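The abstract does not specify the exact form of the CL module's agreement objective, but losses of this kind are commonly instantiated as an InfoNCE-style contrastive loss in which each node's high-level embedding is pulled toward its own low-level embedding (the positive pair) and pushed away from those of other nodes (negatives). The sketch below illustrates that general idea only; the function name, temperature parameter, and NumPy formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def contrastive_agreement_loss(z_high, z_low, temperature=0.5):
    """InfoNCE-style agreement loss (illustrative, not the paper's exact loss).

    z_high: (n, d) high-level node embeddings
    z_low:  (n, d) low-level node embeddings; row i is the positive for z_high[i]
    """
    # L2-normalize so the dot product becomes cosine similarity
    z_high = z_high / np.linalg.norm(z_high, axis=1, keepdims=True)
    z_low = z_low / np.linalg.norm(z_low, axis=1, keepdims=True)
    sim = z_high @ z_low.T / temperature          # (n, n) similarity matrix
    sim = sim - sim.max(axis=1, keepdims=True)    # subtract row max for numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives lie on the diagonal: node i's high-level vs. its own low-level view
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Aligned views (same node, small perturbation) vs. deliberately mismatched views
aligned = contrastive_agreement_loss(z, z + 0.01 * rng.normal(size=(8, 16)))
mismatched = contrastive_agreement_loss(z, np.roll(z, 1, axis=0))
# Aligned view pairs should incur a lower loss than mismatched ones
```

Minimizing such a loss during training encourages the high-level representation to agree with the low-level ones node by node, which is the stated role of the CL module.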

Details

Language :
English
ISSN :
2162-2388
Volume :
PP
Database :
MEDLINE
Journal :
IEEE transactions on neural networks and learning systems
Publication Type :
Academic Journal
Accession number :
38814767
Full Text :
https://doi.org/10.1109/TNNLS.2024.3401449