Performance analysis of the adaptive algorithm for bias-to-variance tradeoff

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Vol. 52, No. 5, pp. 1228-1234
Main Author: Stankovic, L.
Format: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2004

Summary: An algorithm for mean squared error (MSE) minimization, through optimization of the bias-to-variance ratio, has recently been proposed and used in the literature. The algorithm is based on analysis of the intersection of confidence intervals (ICI) and does not require explicit knowledge of the estimation bias to achieve "near to optimal" parameter estimation. This paper presents a detailed analysis of the algorithm's performance, including procedures and relations that can be used for fine adjustment of the algorithm parameters. The reliability of the algorithm is studied for various kinds of estimation noise, and the results are confirmed on a simulated example with uniform, Gaussian, and Laplacian noise. An illustration of the algorithm applied to a simple filtering example is given.
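The ICI rule described in the summary can be sketched as follows: estimates are computed for a sequence of increasing window sizes, a confidence interval is formed around each, and the largest window whose interval still intersects all previous ones is selected. This is a minimal illustrative sketch, not the paper's implementation; the confidence parameter `kappa` and the function name are assumptions for illustration.

```python
import numpy as np

def ici_window(estimates, sigmas, kappa=2.0):
    """Intersection-of-confidence-intervals (ICI) rule sketch.

    estimates: estimates obtained with increasing window sizes
    sigmas:    standard deviations of those estimates (decreasing with window size)
    kappa:     confidence-interval width parameter (assumed value)

    Returns the index of the largest window size whose confidence
    interval still intersects the intervals of all smaller windows.
    """
    lower, upper = -np.inf, np.inf  # running intersection of intervals
    best = 0
    for i, (e, s) in enumerate(zip(estimates, sigmas)):
        lo, hi = e - kappa * s, e + kappa * s
        lower = max(lower, lo)
        upper = min(upper, hi)
        if lower > upper:   # intersection became empty: bias dominates
            break
        best = i            # largest window with a non-empty intersection
    return best
```

For example, with estimates drifting sharply at the largest window (large bias) and shrinking sigmas, the rule stops just before the intervals separate.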
ISSN: 1053-587X
EISSN: 1941-0476
DOI: 10.1109/TSP.2004.826179