
Clustering Analysis via Deep Generative Models With Mixture Models.

Authors :
Yang, Lin
Fan, Wentao
Bouguila, Nizar
Source :
IEEE Transactions on Neural Networks & Learning Systems. Jan 2022, Vol. 33, Issue 1, p340-350. 11p.
Publication Year :
2022

Abstract

Clustering is a fundamental problem that frequently arises in many fields, such as pattern recognition, data mining, and machine learning. Although various clustering algorithms have been developed in the past, traditional clustering algorithms with shallow structures cannot excavate the interdependence of complex data features in latent space. Recently, deep generative models, such as the autoencoder (AE), variational AE (VAE), and generative adversarial network (GAN), have achieved remarkable success in many unsupervised applications thanks to their capability for learning promising latent representations from original data. In this work, we first propose a novel clustering approach based on both the Wasserstein GAN with gradient penalty (WGAN-GP) and a VAE with a Gaussian mixture prior. By combining the WGAN-GP with the VAE, the generator of the WGAN-GP is formulated by drawing samples from the probabilistic decoder of the VAE. Moreover, to provide more robust clustering and generation performance when outliers are encountered in the data, a variant of the proposed deep generative model is developed based on a Student's-t mixture prior. The effectiveness of our deep generative models is validated through experiments on both clustering analysis and sample generation. Through comparison with other state-of-the-art clustering approaches based on deep generative models, the proposed approach can provide more stable training of the model, improve the accuracy of clustering, and generate realistic samples. [ABSTRACT FROM AUTHOR]
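The generative pipeline the abstract describes — draw a latent code from a mixture prior (Gaussian, or heavier-tailed Student's-t for outlier robustness), then pass it through the VAE's decoder to form the GAN generator's output — can be sketched in a minimal form. Everything below is illustrative: the dimensions, parameters, and the stand-in linear "decoder" are assumptions for the sketch, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): K clusters, d-dim latent, x_dim data.
K, d, x_dim = 3, 2, 5

# Mixture prior parameters: weights pi_k, means mu_k, diagonal scales sigma_k.
pi = np.array([0.5, 0.3, 0.2])
mu = rng.normal(size=(K, d))
sigma = np.full((K, d), 0.5)

# Stand-in "probabilistic decoder": a fixed linear map from latent to data space.
W = rng.normal(size=(d, x_dim))

def sample_gaussian_mixture(n):
    """Draw z from the Gaussian mixture prior: sum_k pi_k N(mu_k, diag(sigma_k^2))."""
    k = rng.choice(K, size=n, p=pi)                 # pick a mixture component
    z = mu[k] + sigma[k] * rng.normal(size=(n, d))  # sample within that component
    return z, k

def sample_student_t_mixture(n, nu=4.0):
    """Heavier-tailed variant: each component is a Student's-t, written as a
    Gaussian scale mixture (Gaussian divided by sqrt of a chi-square/nu draw)."""
    k = rng.choice(K, size=n, p=pi)
    g = rng.chisquare(nu, size=(n, 1)) / nu         # latent scale-mixing variable
    z = mu[k] + sigma[k] * rng.normal(size=(n, d)) / np.sqrt(g)
    return z, k

# The WGAN-GP generator is then the prior-sample -> decoder composition;
# its outputs would be fed to the critic during adversarial training.
z, k = sample_gaussian_mixture(8)
x_fake = z @ W
print(x_fake.shape)  # (8, 5)
```

The Student's-t variant differs only in the extra scale-mixing draw, which inflates occasional samples and makes the prior less sensitive to outliers, matching the robustness motivation in the abstract.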

Details

Language :
English
ISSN :
2162-237X
Volume :
33
Issue :
1
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
154800814
Full Text :
https://doi.org/10.1109/TNNLS.2020.3027761