
Adaptive multi-granularity sparse subspace clustering.

Authors :
Deng, Tingquan
Yang, Ge
Huang, Yang
Yang, Ming
Fujita, Hamido
Source :
Information Sciences. Sep 2023, Vol. 642.
Publication Year :
2023

Abstract

Highlights :
• The notion of scored kNN (SkNN) is proposed to guide the propagation of neighborhoods from fine granularity to coarse.
• The existence of a solution to the multi-granularity sparse subspace representation is proved.
• The consistency of the sparse subspace representation across all granularity levels is captured effectively by the proposed AMGSSC.
• The smoothness of the representation learned by AMGLSSC increases the intra-class connectivity of the adjacency graph.

Sparse subspace clustering (SSC) focuses on revealing the data distribution from an algebraic perspective and has been widely applied to high-dimensional data. The key to SSC is to learn the sparsest self-representation of the data and derive an adjacency graph from it. Theoretically, an adjacency matrix with a proper block-diagonal structure leads to the desired clustering result. Various generalizations impose Laplacian regularization or locally linear embedding to describe the manifold structure based on the nearest neighborhoods of samples. However, a single set of nearest neighborhoods cannot effectively characterize local information. From the perspective of granular computing, the notion of scored nearest neighborhoods is introduced to develop multi-granularity neighborhoods of samples. This multi-granularity representation is integrated with SSC to collaboratively learn the sparse representation, yielding an adaptive multi-granularity sparse subspace clustering model (AMGSSC). The learned adjacency matrix has a consistent block-diagonal structure at all granularity levels. Furthermore, the locally linear relationship between samples is embedded in AMGSSC, and an enhanced model, AMGLSSC, is developed to eliminate the over-sparsity of the learned adjacency graph. Experimental results show that both models outperform state-of-the-art subspace clustering methods on several clustering criteria. A minimal code sketch of the baseline SSC pipeline the paper builds on is given below.
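The abstract presupposes the standard SSC pipeline: each sample is expressed as a sparse combination of the other samples (self-expressiveness), the coefficient matrix is symmetrized into an adjacency graph, and spectral clustering is applied. The sketch below illustrates only that generic baseline, not the authors' AMGSSC/AMGLSSC models; the lasso-based self-expression, the regularization value, and the scikit-learn calls are illustrative assumptions.

```python
# Minimal sketch of a baseline SSC pipeline (illustrative; NOT the AMGSSC/AMGLSSC
# method of the paper). Assumes rows of X are samples and columns are features.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def ssc_baseline(X, n_clusters, alpha=0.01):
    """Cluster samples in X (shape: n_samples x n_features) into n_clusters."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        # Self-expressiveness: represent sample i as a sparse combination of
        # the remaining samples (sample i is excluded from the dictionary).
        mask = np.arange(n) != i
        lasso = Lasso(alpha=alpha, max_iter=10000)
        lasso.fit(X[mask].T, X[i])
        C[i, mask] = lasso.coef_
    # Symmetric adjacency graph from the sparse representation matrix;
    # a block-diagonal C corresponds to well-separated subspaces.
    W = np.abs(C) + np.abs(C).T
    labels = SpectralClustering(
        n_clusters=n_clusters, affinity="precomputed", assign_labels="kmeans"
    ).fit_predict(W)
    return labels

# Example usage with synthetic data (values are arbitrary):
# X = np.random.rand(100, 20)
# labels = ssc_baseline(X, n_clusters=5)
```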

Details

Language :
English
ISSN :
0020-0255
Volume :
642
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
164180836
Full Text :
https://doi.org/10.1016/j.ins.2023.119143