Parameter Bounds on Estimation Accuracy Under Model Misspecification
Published in: IEEE Transactions on Signal Processing, Vol. 63, No. 9, pp. 2263-2278
Main Authors:
Format: Journal Article
Language: English
Published: IEEE, 01.05.2015
Summary: When the assumed data distribution differs from the true distribution, the model is said to be misspecified or mismatched. Model misspecification at some level is an inevitability of engineering practice. While Huber's celebrated work assesses maximum-likelihood (ML) performance under misspecification, no simple theory for bounding parameter estimation exists. The class of parameter bounds emerging from the covariance inequality, or equivalently the minimum norm theorem, is revisited. The expectation operator is well known to form an inner product space. Flexibility exists, however, in the choice of expectation integrand and measure of integration, and it is exploited to establish a class of parameter bounds under a general form of model misspecification, i.e., distribution mismatch. The Cramér-Rao bound (CRB) is considered primarily, and secondarily the Barankin/Hammersley-Chapman-Robbins, Bhattacharyya, and Bobrovsky-Mayer-Wolf-Zakai bounds under misspecification. Huber's sandwich covariance is readily established as a special case of the misspecified CRB subject to ML constraints, and generalizations of the Slepian and Bangs formulae under misspecification are obtained.
ISSN: 1053-587X; 1941-0476
DOI: 10.1109/TSP.2015.2411222
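
The sandwich-covariance connection stated in the summary can be made concrete with a short sketch. The notation below is assumed here rather than taken from the paper: p denotes the true density, f(x; θ) the assumed (misspecified) model, and θ0 the pseudo-true parameter minimizing the Kullback-Leibler divergence from p to f(·; θ). For an estimator that is unbiased for θ0 under p, the misspecified CRB takes the familiar sandwich form:

```latex
% A minimal sketch in assumed notation; not a reproduction of the paper's derivation.
% A: expected Hessian of the assumed log-likelihood under the true density p.
% B: covariance of the assumed score under the true density p.
\[
  A(\theta) = \mathbb{E}_{p}\!\left[\nabla_{\theta}^{2}\ln f(\mathbf{x};\theta)\right],
  \qquad
  B(\theta) = \mathbb{E}_{p}\!\left[\nabla_{\theta}\ln f(\mathbf{x};\theta)\,
              \nabla_{\theta}\ln f(\mathbf{x};\theta)^{\mathsf{T}}\right],
\]
% Sandwich bound on the covariance about the pseudo-true parameter \theta_{0}:
\[
  \operatorname{Cov}_{p}\!\left(\hat{\theta}\right)
  \;\succeq\;
  A(\theta_{0})^{-1}\, B(\theta_{0})\, A(\theta_{0})^{-1}.
\]
```

When the model is correctly specified, i.e., p = f(·; θ0), the information-matrix equality gives B(θ0) = -A(θ0), and the sandwich collapses to the classical CRB, B(θ0)^{-1}.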