Towards a Unification of Logic and Information Theory


Bibliographic Details
Published in: arXiv.org
Main Authors: Lastras, Luis A.; Trager, Barry; Lenchner, Jonathan; Szpankowski, Wojtek; Wu, Chai Wah; Squillante, Mark; Gray, Alex
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 16.04.2024

More Information
Summary: This article introduces a theory of communication that covers the following generic scenario: Alice knows more than Bob about a certain set of logic propositions, and Alice and Bob wish to communicate as efficiently as possible with the shared goal that, following their communication, Bob should be able to deduce a particular logic proposition that Alice knows to be true. We assume that our logic system is propositional logic, and we build on top of one of the legendary works in this area, namely the work of Carnap and Bar-Hillel on a theory of semantic information. Our main contribution is a collection of theorems studying various assumptions on what Alice and Bob know and what their goal is. These theorems all provide sharp upper and lower bounds phrased in terms of an entropy-like function that we call \(\Lambda\), in reference to its apparent connection to problems of communication involving logic. It turns out that when the goal is to communicate only a portion of the knowledge that Alice possesses, the optimum communication cost is lower than one might expect, yet, unavoidably, such optimum communication strategies end up allowing Bob to prove even more things than originally intended. Another interesting outcome is that in some scenarios, Alice need not know the logic statements that Bob knows in order to attain asymptotically the same communication efficiency as if she did know them, in a nod to the famous Slepian-Wolf and Wyner-Ziv results from source coding theory. Our work also introduces practical codes, built from a combination of linear codes and enumerative source codes, which turn out to be asymptotically optimal for some scenarios.
ISSN: 2331-8422
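
The summary names enumerative source coding as one ingredient of the paper's practical codes. The sketch below illustrates only that classical component (lexicographic ranking of fixed-weight binary strings, in the style of Cover's enumerative coding); the function names are illustrative, and the paper's actual combination with linear codes is not reproduced here.

```python
# Minimal sketch of classical enumerative source coding for fixed-weight
# binary strings. This is a standard technique, not the paper's construction.
from math import comb

def enumerative_encode(bits):
    """Return the lexicographic rank of `bits` among all binary strings
    of the same length and the same Hamming weight."""
    n = len(bits)
    k = sum(bits)  # Hamming weight; assumed known to the decoder
    rank = 0
    for i, b in enumerate(bits):
        if b == 1:
            # Strings that agree so far but have a 0 here come earlier:
            # they must place all k remaining ones in the n-i-1 later slots.
            rank += comb(n - i - 1, k)
            k -= 1
    return rank

def enumerative_decode(rank, n, k):
    """Invert enumerative_encode given the length n and weight k."""
    bits = []
    for i in range(n):
        c = comb(n - i - 1, k)
        if rank >= c:
            bits.append(1)
            rank -= c
            k -= 1
        else:
            bits.append(0)
    return bits

# Round-trip check on a small example; the rank fits in
# ceil(log2(C(n, k))) bits, the enumerative-coding rate.
x = [0, 1, 1, 0, 1, 0, 0, 1]
r = enumerative_encode(x)
assert enumerative_decode(r, len(x), sum(x)) == x
```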