On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel
| Published in | IEEE Transactions on Information Theory, Vol. 51, No. 9, pp. 3017–3024 |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | New York, NY: IEEE, 01.09.2005 |
| Summary | This paper considers the model of an arbitrarily distributed signal x observed in additive independent white Gaussian noise w, y = x + w. New relations between the minimal mean-square error of the noncausal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimal mean-square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. A comparison between the causal and noncausal estimation errors yields a restricted form of the logarithmic Sobolev inequality. The derivation of the results is based on the Malliavin calculus. |
|---|---|
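The mutual-information/MMSE relation mentioned in the summary states that the derivative of I(x;y) with respect to the signal-to-noise ratio equals half the noncausal minimum mean-square error. The paper treats arbitrarily distributed inputs; the sketch below is only an illustrative numerical check for the special case of a standard Gaussian input, where both sides have well-known closed forms: I(snr) = ½ ln(1 + snr) and mmse(snr) = 1/(1 + snr).

```python
import numpy as np

# Illustrative check of the I-MMSE relation dI/dsnr = (1/2) * mmse(snr)
# for a standard Gaussian input x ~ N(0, 1), where closed forms exist:
#   I(snr)    = 0.5 * ln(1 + snr)   (mutual information, in nats)
#   mmse(snr) = 1 / (1 + snr)       (noncausal minimum mean-square error)

def mutual_info(snr):
    return 0.5 * np.log1p(snr)

def mmse(snr):
    return 1.0 / (1.0 + snr)

snr = np.linspace(0.1, 10.0, 200)

# Numerical derivative of I with respect to snr (second-order finite differences).
dI = np.gradient(mutual_info(snr), snr)

# The relation predicts dI/dsnr == 0.5 * mmse(snr), up to discretization error.
max_err = np.max(np.abs(dI - 0.5 * mmse(snr)))
print(f"max deviation: {max_err:.2e}")
```

For non-Gaussian inputs neither side has a closed form in general, which is exactly where the paper's extended relations and the Malliavin-calculus machinery come into play.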
| ISSN | 0018-9448, 1557-9654 |
|---|---|
| DOI | 10.1109/TIT.2005.853297 |