Generalization of the de Bruijn's identity to general \(\phi\)-entropies and \(\phi\)-Fisher informations
Published in | arXiv.org |
---|---|
Main Authors | , , |
Format | Paper |
Language | English |
Published | Ithaca: Cornell University Library, arXiv.org, 28.11.2016 |
Summary: | In this paper, we propose generalizations of the de Bruijn identity based on extensions of the Shannon entropy, the Fisher information, and their associated divergences or relative measures. The foundation of these generalizations is the class of \(\phi\)-entropies and divergences of Csiszár (or of Salicrú), considered in a multidimensional context that includes the one-dimensional case, and for several types of noisy channels characterized by more general probability distributions than the well-known Gaussian noise. It is found that the gradient and/or the Hessian of these entropies or divergences with respect to the noise parameters naturally give rise to generalized versions of the Fisher information or divergence, which are named the \(\phi\)-Fisher information (divergence). The identities obtained can be viewed as further extensions of the classical de Bruijn identity. Analogously, it is shown that a similar relation holds between the \(\phi\)-divergence and an extended mean-square error, named the \(\phi\)-mean-square error, for the Gaussian channel. |
---|---|
ISSN: | 2331-8422 |
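For orientation, the classical one-dimensional de Bruijn identity that these results generalize can be stated in its standard form (using the usual notation \(h\) for differential entropy and \(J\) for Fisher information; the paper's own conventions and generality may differ): for \(Y_t = X + \sqrt{t}\,Z\) with \(Z \sim \mathcal{N}(0,1)\) independent of \(X\),

\[
\frac{\mathrm{d}}{\mathrm{d}t}\, h(Y_t) \;=\; \tfrac{1}{2}\, J(Y_t),
\qquad
J(Y) \;=\; \int \left( \frac{\partial}{\partial y} \log p_Y(y) \right)^{2} p_Y(y)\, \mathrm{d}y .
\]

The \(\phi\)-divergences referred to in the summary are of the Csiszár type, e.g. \(D_\phi(p \,\|\, q) = \int q(x)\, \phi\!\left( p(x)/q(x) \right) \mathrm{d}x\) with \(\phi\) convex and \(\phi(1)=0\); the generalized identities replace \(h\) and \(J\) above with their \(\phi\)-counterparts.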