A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information

Bibliographic Details
Published in: 2007 IEEE International Symposium on Information Theory, pp. 46-50
Main Author: Rioul, O.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2007

Summary: While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or minimum mean-square error (MMSE). In this paper, we first present a unified view of proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI, which is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
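
For context, the inequality discussed in the summary is Shannon's entropy power inequality; the following is a standard formulation in common notation, given as a reference sketch and not drawn from the record itself.

% Shannon's entropy power inequality (EPI): for independent random
% vectors X and Y in R^n with densities and finite differential entropies,
\[
  N(X + Y) \;\ge\; N(X) + N(Y),
  \qquad\text{where}\qquad
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}
\]
% is the entropy power and h(X) the differential entropy of X.
% The integral representations mentioned in the summary correspond, in
% common notation, to de Bruijn's identity (via the Fisher information J)
% and the I-MMSE relation (via the minimum mean-square error), with Z a
% standard Gaussian vector independent of X:
\[
  \frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right) = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
  \qquad
  \frac{d}{d\gamma}\, I\!\left(X;\, \sqrt{\gamma}\,X + Z\right) = \tfrac{1}{2}\, \mathrm{mmse}(X, \gamma).
\]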
ISBN: 9781424413973, 1424413974
ISSN: 2157-8095, 2157-8117
DOI: 10.1109/ISIT.2007.4557202