Reweighted nonnegative least-mean-square algorithm

Bibliographic Details
Published in: Signal Processing, Vol. 128, pp. 131–141
Main Authors: Chen, Jie; Richard, Cédric; Bermudez, José Carlos M.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.11.2016
ISSN: 0165-1684, 1872-7557
DOI: 10.1016/j.sigpro.2016.03.017

Summary: Statistical inference subject to nonnegativity constraints arises frequently in learning problems. The nonnegative least-mean-square (NNLMS) algorithm was derived to address such problems in an online way. This algorithm builds on a fixed-point iteration strategy driven by the Karush–Kuhn–Tucker conditions. It was shown to provide low-variance estimates, but it suffers from unbalanced convergence rates of these estimates. In this paper, we address this problem by introducing a variant of the NNLMS algorithm. We provide a theoretical analysis of its behavior in terms of transient learning curve, steady-state and tracking performance. We also introduce an extension of the algorithm for online sparse system identification. Monte Carlo simulations are conducted to illustrate the performance of the algorithm and to validate the theoretical results.

Highlights:
• We propose a variant of the NNLMS algorithm with balanced weight convergence rates.
• An accurate performance analysis is carried out for a general nonstationarity model.
• The sparse system identification problem can be solved via the derived algorithm.
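To make the abstract concrete, the sketch below illustrates the kind of multiplicative, KKT-driven weight update the summary refers to, applied to online system identification. It is a minimal sketch only, assuming the standard NNLMS form in which each gradient component is reweighted by the corresponding weight (which keeps the estimate nonnegative for a small enough step size); the function name `nnlms_identify` and all parameter choices are illustrative, not taken from the paper, and the reweighted variant the paper introduces is not reproduced here.

```python
import numpy as np

def nnlms_identify(x, d, n_taps, eta=0.01):
    """Online system identification with a sketch of the NNLMS update.

    Assumed update form: w <- w + eta * e * (w ⊙ u), i.e. the LMS
    gradient term is multiplied componentwise by the current weights,
    so weights initialized positive stay nonnegative for small eta.
    """
    w = np.full(n_taps, 1.0 / n_taps)        # nonnegative initialization
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]    # regressor, most recent sample first
        e = d[k] - w @ u                     # a priori estimation error
        w = w + eta * e * w * u              # componentwise reweighting by w
    return w
```

Note the behavior the paper targets: because each component's effective step size is scaled by its own weight, small weights adapt slowly, which is the unbalanced-convergence issue the proposed variant is designed to fix.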