
Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings.

Authors :
Guo J
Fan W
Amayri M
Bouguila N
Source :
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2024 Dec 04; Vol. 183, pp. 106979. Date of Electronic Publication: 2024 Dec 04.
Publication Year :
2024
Publisher :
Ahead of Print

Abstract

This article proposes a novel deep clustering model based on the variational autoencoder (VAE), named GamMM-VAE, which learns latent representations of training data for clustering in an unsupervised manner. Most existing VAE-based deep clustering methods place a Gaussian mixture model (GMM) prior on the latent space; we instead employ a more flexible, asymmetric Gamma mixture model to obtain higher-quality embeddings of the data latent space. Second, since the Gamma distribution is defined for strictly positive variables, we propose a transformation method from the Gaussian distribution to the Gamma distribution in order to exploit the reparameterization trick of the VAE. This method can also be regarded as a Gamma-distribution reparameterization trick, allowing gradients to be backpropagated through the sampling process in the VAE. Finally, we derive the evidence lower bound (ELBO) based on the Gamma mixture model in a form suited to the stochastic gradient variational Bayes (SGVB) estimator used to optimize the proposed model. The ELBO, a variational inference objective, ensures maximization of the approximation to the posterior distribution, while SGVB is a method for efficient inference and learning in VAEs. We validate the effectiveness of our model through quantitative comparisons with other state-of-the-art deep clustering models on six benchmark datasets. Moreover, owing to the generative nature of VAEs, the proposed model can generate highly realistic samples of specific classes without supervised information.

Competing Interests: Declaration of competing interest. The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: Nizar Bouguila reports financial support provided by the Natural Sciences and Engineering Research Council of Canada. Wentao Fan reports financial support provided by the National Natural Science Foundation of China and by the Guangdong Provincial Key Laboratory of Interdisciplinary Research and Application for Data Science. If there are other authors, they declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

(Copyright © 2024 The Authors. Published by Elsevier Ltd. All rights reserved.)
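The abstract does not spell out the paper's Gaussian-to-Gamma transformation. One classical differentiable map of this kind, shown below purely as an illustrative sketch (not the authors' method), is the Wilson-Hilferty approximation: a standard-normal draw is pushed through a cube-root transform, parameterized by the Gamma shape `alpha` and rate `beta`, so that the sample depends smoothly on the distribution parameters and gradients can flow through sampling:

```python
import numpy as np

def gamma_reparam_sample(alpha, beta, eps):
    """Wilson-Hilferty sketch: map a standard-normal draw `eps` to an
    approximately Gamma(alpha, rate=beta) sample via a transform that is
    differentiable in alpha and beta (the key property a reparameterization
    trick needs). Accuracy improves as alpha grows."""
    # Cube-root normal approximation of the Gamma(alpha, 1) distribution.
    z = alpha * (1.0 - 1.0 / (9.0 * alpha) + eps / (3.0 * np.sqrt(alpha))) ** 3
    # Clip to keep samples strictly positive, then rescale by the rate.
    return np.maximum(z, 1e-8) / beta

rng = np.random.default_rng(0)
alpha, beta = 4.0, 2.0
eps = rng.standard_normal(100_000)
samples = gamma_reparam_sample(alpha, beta, eps)
print(samples.mean())  # near alpha / beta = 2.0
print(samples.var())   # near alpha / beta**2 = 1.0
```

In an actual VAE, `alpha` and `beta` would be encoder outputs and the same transform would be written in an autodiff framework, so the gradient of the ELBO with respect to the encoder parameters passes through the sampled latent variable.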

Details

Language :
English
ISSN :
1879-2782
Volume :
183
Database :
MEDLINE
Journal :
Neural networks : the official journal of the International Neural Network Society
Publication Type :
Academic Journal
Accession number :
39662201
Full Text :
https://doi.org/10.1016/j.neunet.2024.106979