On the Effectiveness of Hybrid Mutual Information Estimation

Bibliographic Details
Published in: arXiv.org
Main Authors: Federici, Marco; Ruhe, David; Forré, Patrick
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 02.06.2023
Summary: Estimating the mutual information from samples of a joint distribution is a challenging problem in both science and engineering. In this work, we realize a variational bound that generalizes both discriminative and generative approaches. Using this bound, we propose a hybrid method to mitigate their respective shortcomings. Further, we propose Predictive Quantization (PQ): a simple generative method that can be easily combined with discriminative estimators for minimal computational overhead. Our propositions yield a tighter bound on the information thanks to the reduced variance of the estimator. We test our methods on a challenging task of correlated high-dimensional Gaussian distributions and a stochastic process involving a system of free particles subjected to a fixed energy landscape. Empirical results show that hybrid methods consistently improve mutual information estimates when compared to their corresponding discriminative counterparts.
ISSN: 2331-8422
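
As a concrete reference point for the correlated-Gaussian benchmark mentioned in the summary: for d-dimensional variables whose components are jointly Gaussian with per-component correlation rho, the ground-truth mutual information is I(X; Y) = -(d/2) log(1 - rho^2). The sketch below is not the paper's code; it illustrates a standard discriminative (InfoNCE-style) bound on such data, with all names and parameter values chosen as assumptions for the example. The critic is set to the analytically known log-density ratio, so the estimate can be compared directly against the closed-form value.

# Minimal sketch (illustrative only): InfoNCE-style discriminative bound on the
# correlated-Gaussian benchmark, where the true mutual information is known.
import numpy as np

rng = np.random.default_rng(0)
d, rho, K = 20, 0.6, 512  # dimension, per-component correlation, batch size (assumed values)

# Sample K pairs: each component of y is rho * x plus independent Gaussian noise.
x = rng.standard_normal((K, d))
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal((K, d))

# Ground-truth MI for this joint Gaussian: I(X; Y) = -(d/2) * log(1 - rho^2).
true_mi = -0.5 * d * np.log(1.0 - rho**2)

def critic(xs, ys):
    """log p(y|x) - log p(y), summed over components, for all (i, j) pairs."""
    resid = ys[None, :, :] - rho * xs[:, None, :]          # (K, K, d) residuals of y_j given x_i
    log_cond = -0.5 * np.log(1.0 - rho**2) - resid**2 / (2.0 * (1.0 - rho**2))
    log_marg = -0.5 * ys[None, :, :] ** 2                  # standard-normal marginal (constants cancel)
    return (log_cond - log_marg).sum(axis=-1)              # (K, K) score matrix

scores = critic(x, y)

# InfoNCE lower bound: mean_i [ s_ii - logsumexp_j s_ij + log K ].
row_max = scores.max(axis=1, keepdims=True)
lse = row_max[:, 0] + np.log(np.exp(scores - row_max).sum(axis=1))
infonce = (np.diag(scores) - lse + np.log(K)).mean()

print(f"true MI          : {true_mi:.2f} nats")
print(f"InfoNCE estimate : {infonce:.2f} nats (bound saturates at log K = {np.log(K):.2f})")

With these assumed settings the true value is about 4.5 nats, comfortably below the log K ceiling of roughly 6.2 nats; that ceiling and the associated variance of discriminative bounds are the kind of shortcomings the abstract says the hybrid approach is designed to mitigate.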