An estimator of the mutual information based on a criterion for conditional independence
| Published in | Computational Statistics & Data Analysis, Vol. 32, No. 1, pp. 1–17 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 28.11.1999 |
| Series | Computational Statistics & Data Analysis |
| Summary | The mutual information is a measure of stochastic dependence. Here, we present a data-dependent nonparametric estimator of the mutual information. The algorithm, upon which this estimator rests, is based on Dobrushin's information theorem. The key idea is to build a succession of finer and finer partitions made of nested hyperrectangles, and stop the refinement process on any hyperrectangle as soon as local independence has been achieved. The bias and variance of this estimator are studied through simulations. This includes a comparison to maximum likelihood estimators. (An illustrative sketch of the partitioning idea follows the record.) |
| ISSN | 0167-9473 (print); 1872-7352 (electronic) |
| DOI | 10.1016/S0167-9473(99)00020-1 |
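
The summary describes the estimator only at a high level. As a rough illustration of the adaptive-partitioning idea it sketches, here is a minimal two-dimensional Python implementation. The median splits, the chi-square stopping rule, and all thresholds (`min_points`, `chi2_crit`) are assumptions made for this sketch; they are not taken from the paper, whose exact refinement and stopping criteria differ.

```python
import numpy as np

def mutual_information(x, y, min_points=32, chi2_crit=7.81):
    """Estimate I(X;Y) in nats by adaptive partitioning (illustrative sketch).

    Each cell is split at the local medians into four quadrants; refinement
    stops once the quadrant counts are consistent with local independence
    (chi-square with 3 d.o.f. at the 5% level) or too few points remain.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = x.size

    def cell_term(m, xlo, xhi, ylo, yhi):
        # Contribution (m/n) * log(m*n / (nx*ny)) of one terminal cell,
        # where nx, ny count ALL samples in the cell's marginal ranges.
        if m == 0:
            return 0.0
        nx = np.count_nonzero((x >= xlo) & (x < xhi))
        ny = np.count_nonzero((y >= ylo) & (y < yhi))
        return (m / n) * np.log(m * n / (nx * ny))

    def recurse(idx, xlo, xhi, ylo, yhi):
        m = idx.size
        if m < min_points:
            return cell_term(m, xlo, xhi, ylo, yhi)
        xs, ys = np.median(x[idx]), np.median(y[idx])
        left, low = x[idx] < xs, y[idx] < ys
        quads = [idx[left & low], idx[~left & low],
                 idx[left & ~low], idx[~left & ~low]]
        counts = np.array([q.size for q in quads], dtype=float)
        # Median splits halve each marginal, so expect ~m/4 per quadrant
        # under local independence.
        chi2 = np.sum((counts - m / 4.0) ** 2) / (m / 4.0)
        if chi2 < chi2_crit or counts.max() == m:
            # Locally independent (or degenerate split on tied values): stop.
            return cell_term(m, xlo, xhi, ylo, yhi)
        bounds = [(xlo, xs, ylo, ys), (xs, xhi, ylo, ys),
                  (xlo, xs, ys, yhi), (xs, xhi, ys, yhi)]
        return sum(recurse(q, *b) for q, b in zip(quads, bounds))

    # Half-open cells; nudge the upper bounds so the maxima are included.
    return recurse(np.arange(n), x.min(), np.nextafter(x.max(), np.inf),
                   y.min(), np.nextafter(y.max(), np.inf))

if __name__ == "__main__":
    # Sanity check on correlated Gaussians, where I(X;Y) = -0.5*log(1-rho^2).
    rho = 0.8
    xy = np.random.default_rng(0).multivariate_normal(
        [0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=20000)
    print(mutual_information(xy[:, 0], xy[:, 1]))  # analytic value ~ 0.511
```

The Gaussian check works because the mutual information of a bivariate normal has the closed form −½ log(1−ρ²), so the estimate can be compared against a known target; the stopping rule is what keeps the partition coarse in regions that already look independent, which is the data-dependent aspect the summary emphasizes.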