A Novel Gaussian Mixture Model for Classification


Bibliographic Details
Published in: 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), pp. 3298-3303
Main Authors: Wan, Huan; Wang, Hui; Scotney, Bryan; Liu, Jun
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2019

Summary: A Gaussian Mixture Model (GMM) is a probabilistic model for representing normally distributed subpopulations within an overall population. It is usually used in unsupervised learning to learn the subpopulations and the subpopulation assignments automatically. It is also used in supervised learning, or classification, to learn the boundaries of subpopulations. However, the performance of GMM as a classifier is not impressive compared with other conventional classifiers such as k-nearest neighbors (KNN), support vector machine (SVM), decision tree and naive Bayes. In this paper, we attempt to address this problem. We propose a GMM classifier, SC-GMM, based on a separability criterion, in order to separate the Gaussian models as much as possible. This classifier finds the optimal number of Gaussian components for each class based on the separability criterion and then determines the parameters of these Gaussian components using the expectation-maximization algorithm. Extensive experiments have been carried out on classification tasks ranging from general data mining to face verification. Results show that SC-GMM significantly outperforms the original GMM classifier. Results also show that SC-GMM is comparable in classification accuracy to three variants of the GMM classifier: Akaike Information Criterion based GMM (AIC-GMM), Bayesian Information Criterion based GMM (BIC-GMM) and variational Bayesian Gaussian mixture (VBGM). However, SC-GMM is significantly more efficient than both AIC-GMM and BIC-GMM. Furthermore, compared with KNN, SVM, decision tree and naive Bayes, SC-GMM achieves competitive classification performance.
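The general scheme the abstract describes — fit one mixture per class, choose the number of components by a model-selection criterion, then classify by the highest class-conditional likelihood — can be sketched as follows. This is a minimal illustration of the BIC-GMM baseline using scikit-learn's `GaussianMixture`; the paper's separability criterion is not reproduced here, and the class name `BICGMMClassifier` and the `max_components` cap are assumptions for the example, not the authors' implementation.

```python
# Hypothetical sketch of a per-class GMM classifier with BIC-based
# component selection (the BIC-GMM baseline from the abstract).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture


class BICGMMClassifier:
    """Fit one GMM per class; pick n_components by minimum BIC."""

    def __init__(self, max_components=5):
        self.max_components = max_components

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_, self.priors_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            # Try 1..max_components mixtures and keep the lowest-BIC model.
            candidates = [
                GaussianMixture(n_components=k, covariance_type="full",
                                random_state=0).fit(Xc)
                for k in range(1, self.max_components + 1)
            ]
            self.models_[c] = min(candidates, key=lambda m: m.bic(Xc))
            self.priors_[c] = len(Xc) / len(X)
        return self

    def predict(self, X):
        # Posterior score = log class prior + GMM log-likelihood per sample.
        scores = np.column_stack([
            np.log(self.priors_[c]) + self.models_[c].score_samples(X)
            for c in self.classes_
        ])
        return self.classes_[np.argmax(scores, axis=1)]


X, y = load_iris(return_X_y=True)
clf = BICGMMClassifier(max_components=3).fit(X, y)
acc = (clf.predict(X) == y).mean()
```

The AIC-GMM variant mentioned in the abstract would differ only in replacing `m.bic(Xc)` with `m.aic(Xc)` in the selection step; the point of SC-GMM is to replace this per-class information criterion with a separability criterion that accounts for overlap between the class mixtures.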
ISSN: 2577-1655
DOI: 10.1109/SMC.2019.8914215