Reversible jump and the label switching problem in hidden Markov models

Bibliographic Details
Published in: Journal of Statistical Planning and Inference, Vol. 139, No. 7, pp. 2305-2315
Main Author: Spezia, Luigi
Format: Journal Article
Language: English
Published: Kidlington: Elsevier B.V., 01.07.2009
Summary: Reversible jump Markov chain Monte Carlo (RJMCMC) algorithms can be efficiently applied in Bayesian inference for hidden Markov models (HMMs) when the number of latent regimes is unknown. As with finite mixture models, when the priors are invariant to relabelling of the regimes, HMMs are unidentifiable when fitted to data, because different labellings of the regimes can alternate during the MCMC iterations; this is the so-called label switching problem. HMMs with an unknown number of regimes are considered here, and the goal of this paper is the applied and theoretical comparison of five methods for tackling label switching within an RJMCMC algorithm: post-processing, partial reordering, permutation sampling, sampling from a Markov prior, and rejection sampling. The five strategies compared have been proposed mostly in the finite mixture model literature; only two of them, rejection sampling and partial reordering, have previously been presented in RJMCMC algorithms for HMMs. We consider RJMCMC algorithms in which the parameters are updated by Gibbs sampling and the dimension of the model changes through split-and-merge and birth-and-death moves. Finally, an example illustrates and compares the five methodologies.
ISSN: 0378-3758, 1873-1171
DOI: 10.1016/j.jspi.2008.10.016
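
To make the label switching problem described in the summary concrete, here is a minimal sketch; it is not the paper's RJMCMC algorithm. It fakes MCMC output for a two-regime model in which the regime labels swap at random between sweeps, then applies the simplest post-processing-style fix: relabelling each draw so that the regime means are in increasing order. The toy sampler, the variable names, and the ordering rule are illustrative assumptions, not taken from the article.

```python
# Toy illustration of label switching and a post-processing relabelling fix.
# Assumption: a two-regime model whose "MCMC draws" of the regime means are
# simulated directly, with labels swapped at random to mimic the symmetry
# that an exchangeable prior leaves in the posterior.
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([-1.0, 2.0])   # two latent regimes

# Fake MCMC output: each sweep draws noisy regime means, and with
# probability 0.5 the two labels are swapped.
n_sweeps = 5000
draws = true_means + 0.3 * rng.standard_normal((n_sweeps, 2))
swap = rng.random(n_sweeps) < 0.5
draws[swap] = draws[swap][:, ::-1]

# Raw component-wise averages are meaningless: because the labels alternate,
# both columns are pulled towards the overall average of the two regime means.
print("raw posterior means:       ", draws.mean(axis=0))

# Post-processing relabelling: permute each draw so the regime means are
# sorted. After relabelling, component-wise summaries are interpretable.
relabelled = np.sort(draws, axis=1)
print("relabelled posterior means:", relabelled.mean(axis=0))
```

Sorting each draw amounts to imposing an ordering constraint after the run, the simplest relative of the post-processing and partial reordering strategies discussed in the paper; the other strategies compared there (permutation sampling, sampling from a Markov prior, rejection sampling) act inside the sampler rather than on its output.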