Properties of the Strong Data Processing Constant for Rényi Divergence


Bibliographic Details
Published in: 2024 IEEE International Symposium on Information Theory (ISIT), pp. 3178 - 3183
Main Authors: Jin, Lifu; Esposito, Amedeo Roberto; Gastpar, Michael
Format: Conference Proceeding
Language: English
Published: IEEE, 07.07.2024
Summary: Strong data processing inequalities (SDPI) are an important object of study in Information Theory and have been well studied for f-divergences. Universal upper and lower bounds have been provided along with several applications, connecting them to impossibility (converse) results, concentration of measure, hypercontractivity, and so on. In this paper, we study Rényi divergence and the corresponding SDPI constant, whose behavior seems to deviate from that of ordinary f-divergences. In particular, one can find examples showing that the universal upper bound relating its SDPI constant to that of Total Variation does not hold in general. In this work, we prove, however, that the universal lower bound involving the SDPI constant of the Chi-square divergence does indeed hold. Furthermore, we also provide a characterization of the distribution that achieves the supremum when the order α is equal to 2 and consequently compute the SDPI constant for Rényi divergence of the general binary channel.
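For orientation, here is a minimal sketch, in standard notation rather than the paper's own, of the objects the summary refers to: the SDPI (contraction) constant of a channel K under a divergence D, and the universal f-divergence bounds whose fate for Rényi divergence the paper settles (the supremum is restricted to pairs P, Q for which the ratio is well defined; the symbols P, Q, K, and η are conventional, not taken from the paper):

\[
\eta_D(K) \;=\; \sup_{P \neq Q}\; \frac{D(PK \,\|\, QK)}{D(P \,\|\, Q)},
\qquad\text{and, for } f\text{-divergences,}\qquad
\eta_{\chi^2}(K) \;\le\; \eta_f(K) \;\le\; \eta_{\mathrm{TV}}(K).
\]

In these terms, the summary states that for the Rényi divergence \(D_\alpha\) the upper bound \(\eta_{D_\alpha}(K) \le \eta_{\mathrm{TV}}(K)\) can fail, while the lower bound \(\eta_{\chi^2}(K) \le \eta_{D_\alpha}(K)\) is proved to hold.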
ISSN: 2157-8117
DOI: 10.1109/ISIT57864.2024.10619367