Entropic measures, Markov information sources and complexity
| Published in | Applied Mathematics and Computation, Vol. 132, No. 2, pp. 369–384 |
| --- | --- |
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | New York, NY: Elsevier Inc, 10.11.2002 |
Summary: The concept of entropy plays a major part in communication theory. The Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is measured by the size in bits of the smallest program for computing that message. This paper discusses the classical entropy and entropy rate for discrete or continuous Markov sources, with finite or continuous alphabets, and their relations to program-size complexity and algorithmic probability. The accent is on ideas, constructions and results; no proofs will be given.
ISSN: 0096-3003, 1873-5649
DOI: 10.1016/S0096-3003(01)00199-0
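As a concrete illustration of the two classical notions named in the summary above (Shannon entropy of a probability distribution and the entropy rate of a finite-alphabet Markov source), the following sketch computes both for a small example. It is not taken from the paper; the function names and the example transition matrix are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): Shannon entropy of a finite
# distribution and the entropy rate of a stationary finite-state Markov source.
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, in bits; terms with p_i = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def stationary_distribution(P):
    """Left eigenvector of the transition matrix P for eigenvalue 1, normalised."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def markov_entropy_rate(P):
    """Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij of an ergodic chain."""
    P = np.asarray(P, dtype=float)
    pi = stationary_distribution(P)
    return float(sum(pi[i] * shannon_entropy(P[i]) for i in range(len(pi))))

if __name__ == "__main__":
    # Binary Markov source: state 0 flips with probability 0.1, state 1 with 0.4.
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])
    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
    print(markov_entropy_rate(P))       # ~0.57 bits: dependence lowers the rate
```

The entropy rate weights the entropy of each row of the transition matrix by the stationary probability of the corresponding state, so memory in the source can only reduce the per-symbol uncertainty relative to an i.i.d. source with the same marginals.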