Relative $\alpha$-Entropy Minimizers Subject to Linear Statistical Constraints

Bibliographic Details
Main Authors: Kumar, M. Ashok; Sundaresan, Rajesh
Format: Journal Article
Language: English
Published: 18.10.2014
DOI: 10.48550/arxiv.1410.4931

More Information
Summary: We study minimization of a parametric family of relative entropies, termed relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}(P,Q)$). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies generalize the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. We study minimization of $\mathscr{I}_{\alpha}(P,Q)$ over the first argument on a set of probability distributions that constitutes a linear family. Such a minimization generalizes the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed the $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to have a power-law form.
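The summary's claims are easy to probe numerically. Below is a minimal Python sketch, not taken from the paper: it evaluates $\mathscr{I}_{\alpha}(P,Q)$ on a finite alphabet assuming the closed-form expression commonly used for this family (which recovers the Kullback-Leibler divergence as $\alpha \to 1$), and computes an $\mathscr{I}_{\alpha}$-projection onto a linear family with a generic constrained solver. The names `relative_alpha_entropy` and `I_alpha_projection` are our own labels, and SLSQP stands in for the paper's analytical characterization.

```python
import numpy as np
from scipy.optimize import minimize

def relative_alpha_entropy(p, q, alpha):
    """Relative alpha-entropy I_alpha(P, Q) on a finite alphabet.

    Assumes the closed form
        I_alpha(P, Q) = alpha/(1-alpha) * log(sum p*q**(alpha-1))
                        + log(sum q**alpha)
                        - 1/(1-alpha) * log(sum p**alpha),
    which reduces to the Kullback-Leibler divergence as alpha -> 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    a = alpha
    return float(
        a / (1.0 - a) * np.log(p @ q ** (a - 1.0))
        + np.log(np.sum(q ** a))
        - 1.0 / (1.0 - a) * np.log(np.sum(p ** a))
    )

def I_alpha_projection(q, f, c, alpha):
    """Minimize I_alpha(P, Q) over the linear family
    {P : sum_x p(x) f_i(x) = c_i for all i}, using a generic
    SLSQP solver (a numerical stand-in, not the paper's method)."""
    n = len(q)
    # Simplex constraint plus one equality per linear statistic.
    constraints = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0}]
    for fi, ci in zip(f, c):
        constraints.append(
            {"type": "eq", "fun": lambda p, fi=fi, ci=ci: p @ fi - ci}
        )
    result = minimize(
        lambda p: relative_alpha_entropy(p, q, alpha),
        x0=np.full(n, 1.0 / n),
        bounds=[(1e-9, 1.0)] * n,
        constraints=constraints,
        method="SLSQP",
    )
    return result.x

# Example: project a random Q onto the linear family that fixes the mean.
rng = np.random.default_rng(0)
q = rng.dirichlet(np.ones(6))
x = np.arange(6, dtype=float)
p_star = I_alpha_projection(q, f=[x], c=[3.2], alpha=0.7)
print("mean of P*:", p_star @ x)                       # ~3.2
print("I_0.7(P*, Q):", relative_alpha_entropy(p_star, q, 0.7))
# Sanity check: alpha near 1 approximates the KL divergence.
print(relative_alpha_entropy(p_star, q, 0.999),
      relative_alpha_entropy(p_star, q, 1.0))
```

For a single moment constraint, the numerical minimizer `p_star` can be compared against the power-law family the abstract describes; the exact parametric form of the $\mathscr{I}_{\alpha}$-projection is established in the paper itself.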