Information-Preserving Markov Aggregation

Bibliographic Details
Published in: arXiv.org
Main Authors: Geiger, Bernhard C.; Temmel, Christoph
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 24.07.2013

More Information
Summary: We present a sufficient condition for a non-injective function of a Markov chain to be a second-order Markov chain with the same entropy rate as the original chain. This permits an information-preserving state space reduction by merging states or, equivalently, lossless compression of a Markov source on a sample-by-sample basis. The cardinality of the reduced state space is bounded from below by the node degrees of the transition graph associated with the original Markov chain. We also present an algorithm that lists, for a given transition graph, all possible information-preserving state space reductions. We illustrate our results by applying the algorithm to a bi-gram letter model of an English text.
ISSN: 2331-8422
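
The summary above concerns merging states of a Markov chain without losing entropy rate. The following Python snippet is a minimal illustrative sketch, not the authors' algorithm: it builds a small, hypothetical three-state chain P, lumps two of its states via a map g, and compares the entropy rate of the original chain with the conditional block entropy H(Y3 | Y1, Y2) of the lumped process. The latter equals the lumped process's entropy rate exactly when that process is second-order Markov, as in the paper's setting, and upper-bounds it otherwise. The chain, the lumping map, and this numerical check are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's algorithm): compare the entropy rate of a toy
# Markov chain X with the conditional entropy H(Y3 | Y1, Y2) of the lumped
# process Y_t = g(X_t). The chain P and the lumping map g are illustrative
# assumptions; equality of the two printed numbers indicates an
# information-preserving merge in the sense of the abstract.
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic matrix P (left eigenvector for eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def entropy_rate(P):
    """Entropy rate (bits/sample) of a stationary Markov chain with kernel P."""
    pi = stationary(P)
    h = 0.0
    for i in range(P.shape[0]):
        for j in range(P.shape[1]):
            if P[i, j] > 0:
                h -= pi[i] * P[i, j] * np.log2(P[i, j])
    return h

def lumped_conditional_entropy(P, g, n_out):
    """H(Y3 | Y1, Y2) for Y_t = g(X_t): equals the entropy rate of Y when Y is
    second-order Markov, and upper-bounds it otherwise."""
    pi = stationary(P)
    n = P.shape[0]
    joint = np.zeros((n_out, n_out, n_out))      # Pr[Y1, Y2, Y3] under stationarity
    for i in range(n):
        for j in range(n):
            for k in range(n):
                joint[g[i], g[j], g[k]] += pi[i] * P[i, j] * P[j, k]
    pair = joint.sum(axis=2)                     # Pr[Y1, Y2]
    h = 0.0
    for (a, b, c), p in np.ndenumerate(joint):
        if p > 0:
            h -= p * np.log2(p / pair[a, b])
    return h

# Toy 3-state chain; states 1 and 2 are merged into output symbol 1 by g.
# Here the pair (Y_{t-1}, Y_t) determines X_t, so this merge preserves the rate.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.3, 0.0, 0.7]])
g = [0, 1, 1]

print("entropy rate of X:   %.4f bits" % entropy_rate(P))
print("H(Y3 | Y1, Y2) of Y: %.4f bits" % lumped_conditional_entropy(P, g, n_out=2))
```

In this toy example both quantities come out equal (about 0.67 bits per sample), because the previous output symbol resolves which of the merged states the chain occupies; for a merge that discards information, the second number would be strictly smaller than the first.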