Unsupervised Clustering through Gaussian Mixture Variational AutoEncoder with Non-Reparameterized Variational Inference and Std Annealing

Bibliographic Details
Published in: 2020 International Joint Conference on Neural Networks (IJCNN), pp. 1-8
Main Authors: Li, Zhihan; Zhao, Youjian; Xu, Haowen; Chen, Wenxiao; Xu, Shangqing; Li, Yilin; Pei, Dan
Format: Conference Proceeding
Language: English
Published: IEEE, 01.07.2020
Summary: Clustering has long been an important research topic in machine learning and is highly valuable in many application tasks. In recent years, many methods have achieved high clustering performance by applying deep generative models. In this paper, we point out that directly using q(z|y, x), instead of resorting to the mean-field approximation adopted in previous works, in the Gaussian Mixture Variational Auto-Encoder can benefit the unsupervised clustering task. We improve the performance of the Gaussian Mixture VAE by optimizing it with a Monte Carlo objective (including the q(z|y, x) term) using the non-reparameterized Variational Inference for Monte Carlo Objectives (VIMCO) method. In addition, we propose std annealing to stabilize the training process and empirically show its effect on forming well-separated embeddings with different variational inference methods. Experimental results on five benchmark datasets show that our proposed algorithm, NVISA, outperforms several baseline algorithms as well as previous clustering methods based on the Gaussian Mixture VAE.
ISSN: 2161-4407
DOI: 10.1109/IJCNN48605.2020.9207493
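
The std annealing described in the summary gradually reduces a standard deviation during early training to stabilize optimization. This record does not give the paper's exact schedule, so the following is a minimal sketch in Python assuming a simple linear decay; the function name annealed_std and all endpoint and warm-up values are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of a std-annealing schedule (assumed linear decay).
    # sigma_start, sigma_end, and warmup_steps are illustrative values only.
    def annealed_std(step, warmup_steps=10000, sigma_start=1.0, sigma_end=0.1):
        """Linearly anneal a standard deviation from sigma_start to sigma_end,
        then hold it at sigma_end once the warm-up phase is over."""
        if step >= warmup_steps:
            return sigma_end
        frac = step / warmup_steps
        return sigma_start + frac * (sigma_end - sigma_start)

    # Example: sampled points along the schedule.
    for step in (0, 5000, 10000, 20000):
        print(step, round(annealed_std(step), 3))   # 1.0, 0.55, 0.1, 0.1

In a full Gaussian Mixture VAE training loop, such an annealed value would stand in for the Gaussian standard deviation during early training, after which the model's own (learned or fixed) std would take over.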