Deep Clustering via Weighted $k$-Subspace Network.
- Source :
- IEEE Signal Processing Letters; Nov 2019, Vol. 26, Issue 11, p1628-1632, 5p
- Publication Year :
- 2019
Abstract
- Subspace clustering aims to separate data into clusters under the hypothesis that samples within the same cluster lie in the same low-dimensional subspace. Due to its tough pairwise constraints, $k$-subspace clustering is sensitive to outliers and to initialization. In this letter, we present a novel deep architecture for $k$-subspace clustering, called Deep Weighted $k$-Subspace Clustering (DWSC), to address this issue. Specifically, our framework consists of an autoencoder and a weighted $k$-subspace network. We first use the autoencoder to non-linearly compress the samples into a low-dimensional latent space. In the weighted $k$-subspace network, we feed the latent representations into an assignment network that outputs soft assignments, which represent the probability of each sample belonging to the corresponding subspace. Subsequently, the $k$ optimal subspaces are identified by minimizing the projection residuals of the latent representations onto all subspaces, using the learned soft assignments as a weighting vector. Finally, representation learning and clustering are jointly optimized in a unified framework. Experimental results show that our approach outperforms state-of-the-art subspace clustering methods on two benchmark datasets.
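- The weighting step described above can be illustrated concretely. The sketch below is a reconstruction based only on the abstract, not the authors' code: it parameterizes each of the $k$ subspaces by a learnable basis, computes soft assignments with a small assignment network, and forms the soft-assignment-weighted sum of squared projection residuals. The module name WeightedKSubspace, the dimensions, and the use of PyTorch are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedKSubspace(nn.Module):
    """Soft-assignment-weighted k-subspace residual loss (illustrative sketch)."""

    def __init__(self, k, latent_dim, subspace_dim):
        super().__init__()
        # One learnable basis per subspace: shape (k, latent_dim, subspace_dim).
        self.bases = nn.Parameter(0.1 * torch.randn(k, latent_dim, subspace_dim))
        # Assignment network: maps a latent code to k soft assignments.
        self.assign = nn.Linear(latent_dim, k)

    def forward(self, z):
        # z: (batch, latent_dim) latent codes produced by the autoencoder.
        w = F.softmax(self.assign(z), dim=1)               # (batch, k) soft assignments
        # Orthonormalize each basis via QR so that Q_j Q_j^T acts as a projection.
        q, _ = torch.linalg.qr(self.bases)                 # (k, latent_dim, subspace_dim)
        coeff = torch.einsum('kds,bd->bks', q, z)          # Q_j^T z
        proj = torch.einsum('kds,bks->bkd', q, coeff)      # Q_j Q_j^T z
        resid = ((z.unsqueeze(1) - proj) ** 2).sum(dim=2)  # (batch, k) squared residuals
        loss = (w * resid).sum(dim=1).mean()               # weighted by soft assignments
        return loss, w


# Illustrative usage with random latent codes standing in for autoencoder outputs.
module = WeightedKSubspace(k=10, latent_dim=32, subspace_dim=5)
z = torch.randn(64, 32)
loss, assignments = module(z)
clusters = assignments.argmax(dim=1)  # hard cluster labels for evaluation
```

- In joint training this term would be added to the autoencoder's reconstruction loss with some trade-off weight; the exact balance and the way DWSC handles subspace orthonormality are not specified in the abstract.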
- Subjects :
- LINEAR programming
- FEATURE extraction
Details
- Language :
- English
- ISSN :
- 1070-9908
- Volume :
- 26
- Issue :
- 11
- Database :
- Complementary Index
- Journal :
- IEEE Signal Processing Letters
- Publication Type :
- Academic Journal
- Accession Number :
- 140082344
- Full Text :
- https://doi.org/10.1109/LSP.2019.2941368