Causation and information flow with respect to relative entropy

Bibliographic Details
Published in: Chaos (Woodbury, N.Y.), Vol. 28, No. 7, p. 075311
Main Author: Liang, X. San
Format: Journal Article
Language: English
Published: United States, 01.07.2018

Summary: Recently, a rigorous formalism has been established for information flow and causality within dynamical systems with respect to Shannon entropy. In this study, we re-establish the formalism with respect to relative entropy, or Kullback-Leibler divergence, a well-accepted measure of predictability because of its appealing properties, such as invariance under nonlinear transformation and consistency with the second law of thermodynamics. Unlike in previous studies (which yield consistent results only for two-dimensional systems), the resulting information flow, say T, is precisely the same as that with respect to Shannon entropy for systems of arbitrary dimensionality, except for a minus sign (reflecting the opposite notions of predictability versus uncertainty). As before, T possesses the property known as the principle of nil causality, which classical formalisms fail to verify in many situations. Moreover, it proves to be invariant under nonlinear transformation, indicating that the so-obtained information flow is an intrinsic physical property. The formalism has been validated with the stochastic gradient system, a nonlinear system that admits an analytical equilibrium solution of the Boltzmann type.
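
The validation setting named in the summary can be illustrated generically: a stochastic gradient system dx = -grad V(x) dt + sqrt(2*eps) dW has the Boltzmann stationary density rho(x) proportional to exp(-V(x)/eps). The Python sketch below (assuming NumPy; the double-well potential V(x) = x^4/4 - x^2/2 and all parameters are illustrative choices, not taken from the paper) integrates such a system with the Euler-Maruyama scheme and checks the empirical density against the Boltzmann form.

import numpy as np

rng = np.random.default_rng(0)

def grad_V(x):
    # Gradient of the illustrative double-well potential V(x) = x**4/4 - x**2/2
    return x**3 - x

eps = 0.5          # noise level ("temperature"); illustrative value
dt = 1e-3          # Euler-Maruyama time step
n_steps = 500_000

x = 0.0
samples = np.empty(n_steps)
for i in range(n_steps):
    # dx = -V'(x) dt + sqrt(2*eps) dW  (Euler-Maruyama step)
    x += -grad_V(x) * dt + np.sqrt(2.0 * eps * dt) * rng.standard_normal()
    samples[i] = x

# Discard the transient, then compare the empirical density with the
# Boltzmann form rho(x) ~ exp(-V(x)/eps), normalized on the same grid.
hist, edges = np.histogram(samples[50_000:], bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
V = centers**4 / 4.0 - centers**2 / 2.0
boltz = np.exp(-V / eps)
boltz /= boltz.sum() * (centers[1] - centers[0])
print("max abs deviation from Boltzmann density:", np.abs(hist - boltz).max())

With a long enough run, the empirical histogram converges to the Boltzmann density, which is what makes this class of systems a convenient analytical benchmark for the information flow formalism.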
ISSN: 1089-7682
DOI: 10.1063/1.5010253