RMT "single-cluster" criterion for predicting large errors (outliers) in maximum-likelihood detection-estimation

Bibliographic Details
Published in: 2009 IEEE/SP 15th Workshop on Statistical Signal Processing, pp. 241 - 244
Main Authors: Abramovich, Yu.I., Johnson, B.A., Spencer, N.K.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2009
Summary: We investigate the sudden onset of failure in maximum-likelihood (ML) detection-estimation on multivariate Gaussian models with a critically small number of data samples (observations). Using methods from random matrix theory (RMT) [also known as generalised statistical analysis (GSA) or G-analysis], we demonstrate that, for any set of true (exact) data parameters, we can identify a parametric space of covariance matrix models that are statistically as likely as the true one. The continuum of such equally likely models defines the nonidentifiability "ambiguity region" of the ML estimation (MLE). When this region includes models with completely erroneous parameters ("outliers"), MLE "performance breakdown" is predicted.
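For orientation only, the "statistically as likely" comparison in the summary can be read through the standard Gaussian log-likelihood of a candidate covariance model; the notation below (M-variate zero-mean snapshots x_1, ..., x_T with sample covariance \hat{R}) is an assumption made here for illustration and is not taken from this record.

\[
  \mathcal{L}(R) \,=\, -T\left[\ln\det R + \operatorname{tr}\!\big(R^{-1}\hat{R}\big)\right] + \text{const},
  \qquad
  \hat{R} \,=\, \frac{1}{T}\sum_{t=1}^{T} x_t x_t^{\mathrm{H}}.
\]

In this reading, two models R_1 and R_2 are "equally likely" when \mathcal{L}(R_1) \approx \mathcal{L}(R_2), and the ambiguity region is the set of models whose likelihood matches that of the true covariance; when that set contains models with completely erroneous parameters, an outlier (MLE performance breakdown) is predicted.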
ISBN: 9781424427093, 1424427096
ISSN: 2373-0803, 2693-3551
DOI: 10.1109/SSP.2009.5278593