EM algorithms of Gaussian mixture model and hidden Markov model
Published in | 2001 International Conference on Image Processing, Vol. 1, pp. 145-148
---|---|
Main Authors | , ,
Format | Conference Proceeding
Language | English; Japanese
Published | IEEE, 2001
ISBN | 0780367251; 9780780367258
DOI | 10.1109/ICIP.2001.958974
Summary | The HMM (hidden Markov model) is a probabilistic model of the joint probability of a collection of random variables comprising both observations and hidden states. The GMM (Gaussian mixture model) is a finite-mixture probability distribution model. Although the two models are closely related, they are usually discussed independently of one another. The EM (expectation-maximization) algorithm is a general iterative method for maximum-likelihood estimation, and the EM formulae for the HMM and the GMM are similar. This paper makes two points. First, the EM algorithm for the GMM can be regarded as a special case of the EM algorithm for the HMM. Second, an EM algorithm for the GMM based on symbols is faster in implementation than the traditional EM algorithm based on individual samples (observations).
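The record does not reproduce the paper's update formulae, but the standard sample-based EM iteration for a Gaussian mixture, which the abstract contrasts with the proposed symbol-based variant, can be sketched for the one-dimensional case as follows. The quantile-based initialization and fixed iteration count are illustrative choices, not taken from the paper:

```python
import numpy as np

def gmm_em(x, k=2, iters=50):
    """Fit a one-dimensional k-component Gaussian mixture to x by EM."""
    n = len(x)
    # Initialization (illustrative): means spread over data quantiles,
    # equal weights, and a common variance equal to the sample variance.
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | sample x[i]).
        dens = (np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form re-estimates of weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

Each E-step touches every sample once, which is the per-iteration cost the paper's symbol-based formulation reportedly reduces.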
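The first claim, that EM for a GMM is a special case of EM for an HMM, rests on the observation that an HMM whose transition-matrix rows all equal the mixture weights generates i.i.d. mixture samples, so its forward likelihood coincides with the GMM likelihood. A minimal numerical check of that identity (the construction is illustrative, not the paper's derivation):

```python
import numpy as np

def gauss(x, mu, var):
    """Univariate Gaussian density, broadcast over components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def hmm_loglik(x, pi, A, mu, var):
    """Log-likelihood of x under a Gaussian-emission HMM (scaled forward algorithm)."""
    alpha = pi * gauss(x[0], mu, var)
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(x)):
        alpha = (alpha @ A) * gauss(x[t], mu, var)
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

def gmm_loglik(x, w, mu, var):
    """Log-likelihood of x under a Gaussian mixture (i.i.d. samples)."""
    dens = gauss(x[:, None], mu, var)
    return np.log((w * dens).sum(axis=1)).sum()
```

When every row of `A` equals the weight vector `w` and the initial distribution is also `w`, the two log-likelihoods agree exactly, which is the sense in which the GMM is the degenerate, state-history-free HMM.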