Output statistics, equivocation, and state masking

Bibliographic Details
Published in: AIMS Mathematics, Vol. 10, no. 6, pp. 13151-13165
Main Author: Wang, Ligong
Format: Journal Article
Language: English
Published: AIMS Press, 01.01.2025
ISSN: 2473-6988
DOI: 10.3934/math.2025590

Summary: Given a discrete memoryless channel and a target distribution on its output alphabet, one wishes to construct a length-$n$, rate-$R$ codebook such that the output distribution, computed over a codeword chosen uniformly at random, is close to the $n$-fold tensor product of the target distribution. Here "close" means that the relative entropy between the output distribution and said $n$-fold product is small. We characterize the smallest achievable relative entropy divided by $n$ as $n$ tends to infinity. We then demonstrate two applications of this result. The first is an alternative proof of the achievability of the rate-equivocation region of the wiretap channel. The second is a new capacity result for communication subject to state masking in the scenario where the decoder has access to channel-state information.
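The closeness criterion described in the summary can be illustrated with a toy numerical sketch. Everything below is an assumption for illustration only, not taken from the paper: a binary symmetric channel with crossover probability 0.1, a uniform target output distribution, a small block length, and a random codebook. The sketch computes the per-letter relative entropy $D(P_{Y^n} \,\|\, Q^{\times n})/n$ between the codebook-induced output distribution and the $n$-fold product of the target:

```python
import itertools
import math
import random

random.seed(0)

# Toy DMC: binary symmetric channel, crossover 0.1 (illustrative assumption).
W = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}
Q = {0: 0.5, 1: 0.5}          # target output distribution (uniform, assumed)

n, R = 6, 0.5                  # toy block length and rate
M = int(round(2 ** (n * R)))   # codebook size 2^{nR}

# Random codebook: M codewords drawn i.i.d. Bernoulli(1/2).
codebook = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(M)]

def output_prob(y):
    """P(y^n): average over a uniformly chosen codeword x of prod_i W(y_i|x_i)."""
    total = 0.0
    for x in codebook:
        p = 1.0
        for xi, yi in zip(x, y):
            p *= W[xi][yi]
        total += p
    return total / M

# D(P || Q^n) / n: relative entropy per channel use, in bits.
D = 0.0
for y in itertools.product((0, 1), repeat=n):
    p = output_prob(y)
    q = math.prod(Q[yi] for yi in y)
    if p > 0:
        D += p * math.log2(p / q)

print(D / n)
```

For a fixed $n$ this brute-force sum is exponential in the block length, so it only runs for tiny examples; the paper's result concerns the asymptotic behavior of this normalized divergence, which, for rates above the relevant threshold, can be driven to its characterized minimum as $n \to \infty$.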