Sharp variance-entropy comparison for nonnegative Gaussian quadratic forms
Format | Journal Article |
Language | English |
Published | 24.05.2020 |
Summary: | In this article we study weighted sums of $n$ i.i.d. Gamma($\alpha$) random variables with nonnegative weights. We show that for $n \geq 1/\alpha$ the sum with equal coefficients maximizes differential entropy when the variance is fixed. As a consequence, we prove that among nonnegative quadratic forms in $n$ independent standard Gaussian random variables, a diagonal form with equal coefficients maximizes differential entropy under a fixed variance. This provides a sharp lower bound for the relative entropy between a nonnegative quadratic form and a Gaussian random variable. Bounds on the capacities of transmission channels subject to $n$ independent additive gamma noises are also derived. |
DOI | 10.48550/arxiv.2005.11705 |
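The entropy comparison in the summary can be illustrated numerically. The sketch below is not from the paper; the choices of $n = 4$, $\alpha = 1$ (so $n \geq 1/\alpha$), the sample size, and the two weight vectors are illustrative assumptions. Both weight vectors satisfy $\sum_i a_i^2 = 1$, so the two weighted sums have the same variance, and the equal-coefficient sum should show the larger estimated differential entropy.

```python
import numpy as np
from scipy.stats import differential_entropy

# Illustrative check only: compare the estimated differential entropy of an
# equal-weight sum of i.i.d. Gamma(alpha) variables against a lopsided sum
# with the same variance (same sum of squared weights).
rng = np.random.default_rng(0)
n, alpha, m = 4, 1.0, 200_000  # n >= 1/alpha holds for these choices

x = rng.gamma(alpha, size=(m, n))  # m samples of n i.i.d. Gamma(alpha)

a_equal = np.full(n, 1 / np.sqrt(n))      # equal coefficients, sum a_i^2 = 1
a_skew = np.array([1.0, 0.0, 0.0, 0.0])   # degenerate case, same sum a_i^2

h_equal = differential_entropy(x @ a_equal)
h_skew = differential_entropy(x @ a_skew)
print(h_equal, h_skew)  # the equal-weight sum should have larger entropy
```

For $\alpha = 1$ the lopsided sum is just an Exp(1) variable, whose differential entropy is exactly 1 nat, which gives a convenient sanity check on the estimator.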