Higher-Order Interactions and Their Duals Reveal Synergy and Logical Dependence beyond Shannon-Information

Bibliographic Details
Published in: Entropy (Basel, Switzerland), Vol. 25, No. 4, p. 648
Main Author: Jansma, Abel
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 12.04.2023

Summary: Information-theoretic quantities reveal dependencies among variables in the structure of joint, marginal, and conditional entropies, while leaving certain fundamentally different systems indistinguishable. Furthermore, there is no consensus on the correct higher-order generalisation of mutual information (MI). In this manuscript, we show that a recently proposed model-free definition of higher-order interactions among binary variables (MFIs) is, like mutual information, a Möbius inversion on a Boolean algebra, but of surprisal instead of entropy. This provides an information-theoretic interpretation of the MFIs and, by extension, of Ising interactions. We study the objects dual to mutual information and the MFIs on the order-reversed lattices. We find that dual MI is related to the previously studied differential mutual information, while dual interactions are interactions with respect to a different background state. Unlike (dual) mutual information, interactions and their duals uniquely identify all six 2-input logic gates, the dyadic and triadic distributions, and different causal dynamics that are identical in terms of their Shannon information content.
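
The parallel the abstract draws, between mutual information as a Möbius inversion of entropy over the Boolean lattice of variable subsets and the MFIs as the analogous inversion of surprisal, can be sketched in a few lines of code. The Python snippet below is not from the paper: the co-information sign convention, the use of log-probabilities rather than surprisals, and the all-zeros background state for the interaction are assumptions made for illustration only.

from itertools import combinations
from math import log2
import numpy as np

def joint_entropy(p, keep):
    """Shannon entropy (bits) of the marginal of p over the variable indices in `keep`."""
    if not keep:
        return 0.0
    axes = tuple(i for i in range(p.ndim) if i not in keep)
    marg = p.sum(axis=axes) if axes else p
    marg = marg[marg > 0]
    return float(-(marg * np.log2(marg)).sum())

def interaction_information(p, variables):
    """Möbius inversion of entropy over the subsets of `variables`
    (co-information sign convention, an assumption of this sketch)."""
    total = 0.0
    for k in range(len(variables) + 1):
        for subset in combinations(variables, k):
            total += (-1) ** (len(variables) - k) * joint_entropy(p, subset)
    return -total

def mfi(p, variables):
    """Model-free interaction among binary `variables`: the analogous alternating
    (Möbius) sum of log-probabilities over the Boolean lattice of states, with the
    remaining variables clamped to an all-zeros background (assumed here)."""
    total = 0.0
    for k in range(len(variables) + 1):
        for subset in combinations(variables, k):
            state = [0] * p.ndim
            for v in subset:
                state[v] = 1
            total += (-1) ** (len(variables) - k) * log2(p[tuple(state)])
    return total

# Example: two correlated binary variables with full support.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(interaction_information(p, (0, 1)))  # entropy-based: roughly 0.28 bits
print(mfi(p, (0, 1)))                      # surprisal-based: log2(16) = 4.0

For two variables these reduce to the familiar forms I(X;Y) = H(X) + H(Y) - H(X,Y) and log[p(1,1)p(0,0) / (p(1,0)p(0,1))], the latter being the pairwise Ising coupling that the abstract refers to.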
ISSN: 1099-4300
DOI: 10.3390/e25040648