Entropic measures, Markov information sources and complexity

Bibliographic Details
Published in: Applied Mathematics and Computation, Vol. 132, No. 2, pp. 369-384
Main Authors: Calude, Cristian S.; Dumitrescu, Monica
Format: Journal Article
Language: English
Published: New York, NY: Elsevier Inc., 10.11.2002

Summary: The concept of entropy plays a major part in communication theory. The Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is measured by the size in bits of the smallest program computing that message. This paper discusses the classical entropy and entropy rate for discrete or continuous Markov sources, with finite or continuous alphabets, and their relations to program-size complexity and algorithmic probability. The accent is on ideas, constructions and results; no proofs are given.
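For orientation, the standard textbook definitions of the three quantities named in the abstract, stated here as generic formulations rather than as the paper's own notation: the Shannon entropy of a discrete distribution p, the entropy rate of a stationary Markov source with transition matrix P = (P_{ij}) and stationary distribution \pi, and the program-size (Kolmogorov-Chaitin) complexity of a string x with respect to a universal machine U.

\[
H(X) = -\sum_{x} p(x) \log_2 p(x) \qquad \text{(Shannon entropy)}
\]
\[
H_{\mathrm{rate}} = -\sum_{i} \pi_i \sum_{j} P_{ij} \log_2 P_{ij} \qquad \text{(entropy rate of a stationary Markov chain)}
\]
\[
K_U(x) = \min\{\, |p| : U(p) = x \,\} \qquad \text{(program-size complexity)}
\]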
ISSN: 0096-3003
EISSN: 1873-5649
DOI: 10.1016/S0096-3003(01)00199-0