Generating random bits from an arbitrary source: fundamental limits


Bibliographic Details
Published in: IEEE Transactions on Information Theory, Vol. 41, No. 5, pp. 1322-1332
Main Authors: Vembu, S.; Verdu, S.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.1995

Summary: Suppose we are given a random source and want to use it as a random number generator; at what rate can we generate fair bits from it? We address this question in an information-theoretic setting by allowing for an arbitrarily small but nonzero deviation from "ideal" random bits. We prove our results under three different measures of approximation between the ideal and the obtained probability distributions: the variational distance, the d-bar distance, and the normalized divergence. Two contexts are studied: fixed-length and variable-length random number generation. The fixed-length results of this paper provide an operational characterization of the inf-entropy rate of a source, defined in Han and Verdu (ibid., vol. 39, no. 3, pp. 752-772, 1993), and the variable-length results characterize the liminf of the entropy rate, thereby establishing a pleasing duality with the fundamental limits of source coding. A feature of our results is that we do not restrict ourselves to ergodic or to stationary sources.
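For reference, the main quantities named in the summary admit standard formulations (the notation below is ours, not necessarily the paper's; the d-bar distance, Ornstein's process distance, is omitted):

```latex
% Variational distance between distributions P and Q on length-n strings:
d(P,Q) \;=\; \sum_{x \in A^n} \bigl| P(x) - Q(x) \bigr|

% Normalized divergence (relative entropy per symbol):
\tfrac{1}{n} D(P \,\|\, Q) \;=\; \tfrac{1}{n} \sum_{x \in A^n} P(x) \log \frac{P(x)}{Q(x)}

% Inf-entropy rate of X = (X_1, X_2, \dots), the liminf in probability of the
% normalized information density (Han--Verdu, 1993):
\underline{H}(\mathbf{X}) \;=\; \sup\Bigl\{ \alpha :
  \lim_{n\to\infty} \Pr\Bigl[ \tfrac{1}{n}\log\tfrac{1}{P_{X^n}(X^n)} < \alpha \Bigr] = 0 \Bigr\}

% Liminf of the entropy rate (the variable-length benchmark):
\liminf_{n\to\infty} \tfrac{1}{n} H(X^n)
```

The fixed-length results characterize the maximal bit-generation rate by the inf-entropy rate, and the variable-length results by the liminf of the entropy rate; neither quantity requires the source to be stationary or ergodic.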
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/18.412679