The Data-Constrained Generalized Maximum Entropy Estimator of the GLM: Asymptotic Theory and Inference

Bibliographic Details
Published in: Entropy (Basel, Switzerland), Vol. 15, No. 5, pp. 1756-1775
Main Authors: Mittelhammer, Ron; Cardell, Nicholas; Marsh, Thomas
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01 May 2013

Summary: Maximum entropy methods of parameter estimation are appealing because they impose no additional structure on the data, other than that explicitly assumed by the analyst. In this paper we prove that the data-constrained GME estimator of the general linear model is consistent and asymptotically normal. The approach we take in establishing the asymptotic properties concomitantly identifies a new computationally efficient method for calculating GME estimates. Formulae are developed to compute asymptotic variances and to perform Wald, likelihood ratio, and Lagrange multiplier statistical tests on model parameters. Monte Carlo simulations are provided to assess the performance of the GME estimator in both large and small sample situations. Furthermore, we extend our results to maximum cross-entropy estimators and indicate a variant of the GME estimator that is unbiased. Finally, we discuss the relationship of GME estimators to Bayesian estimators, pointing out the conditions under which an unbiased GME estimator would be efficient.
ISSN: 1099-4300
DOI: 10.3390/e15051756
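
As a concrete illustration of the estimator described in the summary, the sketch below sets up the primal data-constrained GME problem for a linear model: each coefficient and each error term is reparameterized as a convex combination of support points, and the joint entropy of the probability weights is maximized subject to the data constraints and adding-up conditions. The support grids z_support and v_support, the helper name gme_linear, and the use of SciPy's generic SLSQP solver are illustrative assumptions only; the paper itself develops a different, more computationally efficient way of computing GME estimates.

```python
# Minimal sketch of a data-constrained GME estimator for y = X @ beta + e.
# The support grids and the generic constrained optimizer are assumptions for
# illustration; they are not the computational method derived in the paper.
import numpy as np
from scipy.optimize import minimize

def gme_linear(y, X, z_support, v_support):
    n, k = X.shape
    M = len(z_support)          # support points per coefficient
    J = len(v_support)          # support points per error term

    def unpack(theta):
        p = theta[: k * M].reshape(k, M)   # coefficient probability weights
        w = theta[k * M:].reshape(n, J)    # error probability weights
        return p, w

    def neg_entropy(theta):
        # Minimizing sum(p log p) + sum(w log w) maximizes joint entropy.
        p, w = unpack(theta)
        eps = 1e-12                        # guard against log(0)
        return np.sum(p * np.log(p + eps)) + np.sum(w * np.log(w + eps))

    def data_constraint(theta):
        # y = X beta + e, with beta_k = sum_m z_m p_km and e_i = sum_j v_j w_ij.
        p, w = unpack(theta)
        beta = p @ z_support
        e = w @ v_support
        return y - X @ beta - e            # must equal zero elementwise

    def adding_up(theta):
        # Each row of p and of w must sum to one.
        p, w = unpack(theta)
        return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

    theta0 = np.concatenate([np.full(k * M, 1.0 / M), np.full(n * J, 1.0 / J)])
    res = minimize(
        neg_entropy, theta0, method="SLSQP",
        bounds=[(0.0, 1.0)] * theta0.size,
        constraints=[{"type": "eq", "fun": data_constraint},
                     {"type": "eq", "fun": adding_up}],
    )
    p_hat, _ = unpack(res.x)
    return p_hat @ z_support               # recovered coefficient estimates

# Usage on a small simulated data set (true coefficients 1.0 and 2.0).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=30)
print(gme_linear(y, X, z_support=np.linspace(-10, 10, 5),
                 v_support=np.linspace(-3, 3, 3)))
```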