EM algorithms of Gaussian mixture model and hidden Markov model

Bibliographic Details
Published in: 2001 International Conference on Image Processing, Vol. 1, pp. 145-148
Main Authors: Guorong Xuan, Wei Zhang, Peiqi Chai
Format: Conference Proceeding
Language: English, Japanese
Published: IEEE, 2001
ISBN: 0780367251, 9780780367258
DOI: 10.1109/ICIP.2001.958974

Summary: The HMM (hidden Markov model) is a probabilistic model of the joint probability of a collection of random variables comprising both observations and hidden states. The GMM (Gaussian mixture model) is a finite mixture probability distribution model. Although the two models are closely related, they are usually discussed independently and separately. The EM (expectation-maximization) algorithm is a general iterative method for finding maximum likelihood estimates, and the EM formulae for the HMM and the GMM are similar. This paper makes two points. First, the EM algorithm of the GMM can be regarded as a special case of the EM algorithm of the HMM. Second, an EM algorithm for the GMM based on symbols is faster in implementation than the traditional EM algorithm based on individual samples (observations).
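The symbol-based speed-up claimed above can be illustrated with a minimal sketch (not the authors' implementation): quantize the observations into a small alphabet of bin-center "symbols", then run EM for a one-dimensional Gaussian mixture over the unique symbols weighted by their counts, so each iteration costs on the order of the number of symbols rather than the number of samples. The function name, bin count, and NumPy-based formulation below are illustrative assumptions.

import numpy as np

def em_gmm_symbols(samples, n_components, n_bins=64, n_iters=50, seed=0):
    # EM for a 1-D Gaussian mixture run over binned "symbols":
    # each unique bin center is processed once, weighted by its count.
    rng = np.random.default_rng(seed)

    # Quantize samples into symbols (bin centers) with occurrence counts.
    counts, edges = np.histogram(samples, bins=n_bins)
    symbols = 0.5 * (edges[:-1] + edges[1:])
    keep = counts > 0
    symbols, counts = symbols[keep], counts[keep].astype(float)
    n_total = counts.sum()

    # Initialize mixture weights, means, and variances.
    weights = np.full(n_components, 1.0 / n_components)
    means = rng.choice(symbols, size=n_components, replace=False)
    variances = np.full(n_components, samples.var())

    for _ in range(n_iters):
        # E-step: responsibilities of each component for each symbol.
        diff = symbols[:, None] - means[None, :]
        log_pdf = -0.5 * (diff ** 2 / variances + np.log(2 * np.pi * variances))
        log_resp = np.log(weights) + log_pdf
        log_resp -= log_resp.max(axis=1, keepdims=True)
        resp = np.exp(log_resp)
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: symbol counts act as weights on the responsibilities.
        weighted = resp * counts[:, None]
        nk = weighted.sum(axis=0)
        weights = nk / n_total
        means = (weighted * symbols[:, None]).sum(axis=0) / nk
        variances = (weighted * (symbols[:, None] - means) ** 2).sum(axis=0) / nk

    return weights, means, variances

# Example: fit a two-component mixture to synthetic data.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 0.5, 5000), rng.normal(3.0, 1.0, 5000)])
print(em_gmm_symbols(data, n_components=2))

In this sketch the per-iteration work depends only on the number of occupied bins, which is why a symbol-based E-step can be much cheaper than evaluating every raw sample when the dataset is large relative to the symbol alphabet.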