Estimation of entropy measures for categorical variables with spatial correlation
Main Authors | , , |
---|---|
Format | Journal Article |
Language | English |
Published | 09.11.2019 |
Summary: | Entropy is a measure of heterogeneity widely used in the applied
sciences, often when data are collected over space. Recently, a number of
approaches have been proposed to include spatial information in entropy. The
aim of entropy is to synthesize the observed data into a single, interpretable
number. In other studies the objective is, instead, to use data for entropy
estimation; several proposals can be found in the literature, which are
essentially corrections of the estimator based on substituting the involved
probabilities with sample proportions. In this case, independence is assumed
and spatial correlation is not considered. We propose a different path for
spatial entropy estimation: instead of correcting the global entropy
estimator, we focus on improving the estimation of its components, i.e. the
probabilities, in order to account for spatial effects. Once the probabilities
are suitably estimated, estimating entropy is straightforward, since it is a
deterministic function of the distribution. Following a Bayesian approach, we
derive the posterior probabilities of a multinomial distribution for
categorical variables, accounting for spatial correlation. A posterior
distribution for entropy can then be obtained, which may be summarized as
desired and displayed as an entropy surface for the area under study. |
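The core idea of the abstract — estimate the category probabilities first, then treat entropy as a deterministic function of them — can be sketched as follows. This is a minimal illustration that omits the paper's spatial-correlation component: it assumes a plain Dirichlet prior on the multinomial probabilities and made-up category counts, and shows how each posterior draw of the probability vector yields one draw from the posterior distribution of entropy.

```python
import numpy as np

# Made-up counts for three categories observed over the study area
# (illustrative only; not data from the paper).
counts = np.array([30, 15, 5])

# Dirichlet(1, ..., 1) prior on the multinomial probabilities gives a
# Dirichlet(counts + 1) posterior (conjugacy, no spatial effects here).
rng = np.random.default_rng(0)
posterior_samples = rng.dirichlet(counts + 1, size=10_000)

# Entropy is a deterministic function of the distribution, so each
# posterior draw of p gives one draw of H(p) = -sum_i p_i log p_i.
entropy_draws = -np.sum(posterior_samples * np.log(posterior_samples), axis=1)

print(f"posterior mean entropy: {entropy_draws.mean():.3f}")
print(f"95% credible interval: "
      f"({np.quantile(entropy_draws, 0.025):.3f}, "
      f"{np.quantile(entropy_draws, 0.975):.3f})")
```

The resulting sample of `entropy_draws` is the posterior distribution of entropy, which can be summarized as desired (mean, credible intervals); repeating this per location with spatially-informed posteriors is what would produce the entropy surface the abstract describes.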
DOI: 10.48550/arxiv.1911.03685