An estimator of the mutual information based on a criterion for conditional independence


Bibliographic Details
Published in: Computational Statistics & Data Analysis, Vol. 32, No. 1, pp. 1–17
Main Author: Darbellay, Georges A.
Format: Journal Article
Language: English
Published: Elsevier B.V., 28.11.1999
Series: Computational Statistics & Data Analysis

More Information
Summary: The mutual information is a measure of stochastic dependence. Here we present a data-dependent, nonparametric estimator of the mutual information. The algorithm on which this estimator rests is based on Dobrushin's information theorem. The key idea is to build a succession of finer and finer partitions made of nested hyperrectangles, and to stop the refinement process on any hyperrectangle as soon as local independence has been achieved. The bias and variance of this estimator are studied through simulations, including a comparison with maximum likelihood estimators.
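The partitioning idea in the summary can be illustrated with a minimal sketch: recursively split each cell at the marginal medians, and stop refining a cell once the four sub-cell counts are consistent with local independence. The chi-square stopping rule, its threshold, and the `min_points` parameter below are illustrative assumptions, not the paper's exact criterion derived from Dobrushin's theorem.

```python
import numpy as np

def mutual_information(x, y, min_points=32):
    """Sketch of an adaptive-partitioning MI estimator for two 1-D samples.

    Cells are split at the marginal medians; refinement stops when the
    quadrant counts look locally independent (assumed chi-square test).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    mi = 0.0

    def recurse(ix, xlo, xhi, ylo, yhi):
        nonlocal mi
        m = len(ix)
        if m == 0:
            return
        split = m >= min_points
        if split:
            mx, my = np.median(x[ix]), np.median(y[ix])
            quads = [
                (ix[(x[ix] <= mx) & (y[ix] <= my)], xlo, mx, ylo, my),
                (ix[(x[ix] <= mx) & (y[ix] >  my)], xlo, mx, my, yhi),
                (ix[(x[ix] >  mx) & (y[ix] <= my)], mx, xhi, ylo, my),
                (ix[(x[ix] >  mx) & (y[ix] >  my)], mx, xhi, my, yhi),
            ]
            counts = np.array([len(q[0]) for q in quads])
            # Under local independence each quadrant holds ~m/4 points;
            # 7.81 is the 95% chi-square quantile with 3 dof (assumed rule).
            chi2 = np.sum((counts - m / 4) ** 2) / (m / 4)
            split = chi2 > 7.81
        if split:
            for q in quads:
                recurse(*q)
        else:
            # Leaf cell: add its contribution p * log(p / (px * py)),
            # with marginal cell probabilities estimated empirically.
            p = m / n
            px = np.mean((x >= xlo) & (x <= xhi))
            py = np.mean((y >= ylo) & (y <= yhi))
            if px * py > 0:
                mi += p * np.log(p / (px * py))

    recurse(np.arange(n), x.min(), x.max(), y.min(), y.max())
    return mi
```

On independent samples the top-level quadrant counts are near m/4, so refinement stops early and the estimate stays near zero; on strongly dependent samples the partition keeps refining along the support of the joint density and the estimate grows.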
ISSN: 0167-9473, 1872-7352
DOI: 10.1016/S0167-9473(99)00020-1