Jisuanji kexue yu tansuo (Feb 2023)

Variational Deep Generative Clustering Model Under Entropy Regularizations

  • ZHANG Zhiyuan, CHEN Yarui, YANG Jianning, DING Wenqiang, YANG Jucheng

DOI
https://doi.org/10.3778/j.issn.1673-9418.2104091
Journal volume & issue
Vol. 17, no. 2
pp. 376 – 384

Abstract

Clustering methods based on deep learning can automatically learn the latent features of data and generalize easily to large-scale, high-dimensional datasets. Traditional deep clustering methods focus on extracting hidden-layer features of the data through deep neural networks to improve clustering accuracy, but pay less attention to analyzing how certain the category assignments of the data are in the clustering task, and they lack analysis of the distribution of the discrete latent vector after constraints are imposed. This paper proposes a variational deep generative clustering model under entropy regularizations (VDGC-ER), which uses the variational auto-encoder as the basic framework and introduces a Gaussian mixture model as the prior of the latent variables. The paper first proposes a sample entropy regularization term on the discrete latent vector of the Gaussian mixture model to improve the clustering accuracy of the model. It then defines an aggregated sample entropy regularization term on the discrete latent vector to reduce clustering imbalance, avoid local optima, and improve generative diversity. Next, Monte Carlo sampling and reparameterization strategies are used to estimate the optimization objective of the VDGC-ER model, and stochastic gradient descent is used to learn the model parameters. Finally, comparison experiments are designed on the MNIST, REUTERS, REUTERS-10K and HHAR datasets to demonstrate the performance of the VDGC-ER model. Experimental results show that the model can not only generate high-quality samples but also achieve high clustering accuracy.
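To make the two regularizers concrete, the sketch below (not the authors' code) illustrates one common way to realize a per-sample entropy penalty and an aggregated (batch-level) entropy term on the discrete cluster-assignment distribution, together with the Gaussian reparameterization used for Monte Carlo estimation. The tensor names (probs, mu, logvar) and the way the two terms are combined are assumptions made for illustration only.

```python
# Minimal sketch of the entropy regularizers described in the abstract,
# plus Gaussian reparameterization, using PyTorch. Not the VDGC-ER implementation.
import torch

def sample_entropy(probs, eps=1e-10):
    """Per-sample entropy of the discrete assignment distribution q(c|x).
    Penalizing this term pushes each sample toward a confident (low-entropy)
    cluster assignment."""
    return -(probs * (probs + eps).log()).sum(dim=1).mean()

def aggregated_entropy(probs, eps=1e-10):
    """Entropy of the batch-averaged assignment distribution.
    Keeping this term large counteracts cluster imbalance, i.e. all samples
    collapsing into a few clusters."""
    avg = probs.mean(dim=0)
    return -(avg * (avg + eps).log()).sum()

def reparameterize(mu, logvar):
    """Standard Gaussian reparameterization: z = mu + sigma * eps."""
    std = (0.5 * logvar).exp()
    return mu + std * torch.randn_like(std)

# Toy usage: a batch of 8 samples, 10 mixture components, 20-dim latent space.
probs = torch.softmax(torch.randn(8, 10), dim=1)     # q(c|x) from an encoder
mu, logvar = torch.randn(8, 20), torch.randn(8, 20)  # Gaussian posterior params
z = reparameterize(mu, logvar)                       # Monte Carlo latent sample
# Hypothetical combined regularizer: sharpen per-sample assignments while
# keeping the aggregate assignment distribution balanced across clusters.
reg = sample_entropy(probs) - aggregated_entropy(probs)
print(z.shape, reg.item())
```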

Keywords