Sharp Variance-Entropy Comparison for Nonnegative Gaussian Quadratic Forms

Bibliographic Details
Published in: IEEE Transactions on Information Theory, Vol. 67, No. 12, pp. 7740-7751
Main Authors: Bartczak, Maciej; Nayar, Piotr; Zwara, Szymon
Format: Journal Article
Language: English
Published: New York, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2021

Summary: In this article we study weighted sums of $n$ i.i.d. Gamma($\alpha$) random variables with nonnegative weights. We show that for $n \geq 1/\alpha$ the sum with equal coefficients maximizes differential entropy when the variance is fixed. As a consequence, we prove that among nonnegative quadratic forms in $n$ independent standard Gaussian random variables, a diagonal form with equal coefficients maximizes differential entropy under a fixed variance. This provides a sharp lower bound for the relative entropy between a nonnegative quadratic form and a Gaussian random variable. Bounds on the capacities of transmission channels subject to $n$ independent additive gamma noises are also derived.
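The main comparison in the abstract can be checked numerically. The sketch below (an illustration under assumptions of our own, not code from the paper) takes $\alpha = 1$ and $n = 4$, so $n \geq 1/\alpha$ holds, and compares a Kozachenko-Leonenko nearest-neighbor entropy estimate for the equal-weight sum against a maximally unequal one, both scaled to unit variance; the estimator choice and all function names here are our own.

```python
import math
import numpy as np

EULER_GAMMA = 0.5772156649015329

def entropy_1nn(samples):
    """Kozachenko-Leonenko 1-nearest-neighbor estimate of differential entropy (nats)."""
    x = np.sort(np.asarray(samples, dtype=float))
    m = x.size
    gaps = np.diff(x)
    # nearest-neighbor distance for each point (edge points use their single gap)
    d = np.minimum(np.r_[gaps[0], gaps], np.r_[gaps, gaps[-1]])
    d = d[d > 0]  # guard against exact ties
    return float(np.mean(np.log(2.0 * d))) + math.log(m - 1) + EULER_GAMMA

def weighted_gamma_sum(weights, alpha, size, rng):
    """Sample sum_i w_i * G_i, G_i i.i.d. Gamma(alpha, 1), weights scaled to unit variance."""
    w = np.asarray(weights, dtype=float)
    w = w / math.sqrt(alpha * np.sum(w ** 2))  # Var = alpha * sum_i w_i^2 = 1
    g = rng.gamma(alpha, size=(size, w.size))
    return g @ w

rng = np.random.default_rng(0)
alpha, N = 1.0, 200_000  # alpha = 1 and n = 4 satisfy n >= 1/alpha
h_equal = entropy_1nn(weighted_gamma_sum([1, 1, 1, 1], alpha, N, rng))
h_single = entropy_1nn(weighted_gamma_sum([1, 0, 0, 0], alpha, N, rng))
print(h_equal, h_single)  # equal weights should give the larger entropy
```

For these parameters the two entropies can also be computed in closed form from the Gamma entropy formula (about 1.33 nats for the equal-weight sum versus 1.00 nats for the single exponential), so the estimated gap is consistent with the theorem's direction.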
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2021.3113281