Distributed recursive least-squares with data-adaptive censoring

Bibliographic Details
Published in: 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5860-5864
Main Authors: Zifeng Wang, Zheng Yu, Qing Ling, Dimitris Berberidis, Georgios B. Giannakis
Format: Conference Proceeding
Language: English
Published: IEEE, 01.03.2017

Summary: The deluge of networked big data motivates the development of computation- and communication-efficient network information processing algorithms. In this paper, we propose two data-adaptive censoring strategies that significantly reduce the computation and communication costs of the distributed recursive least-squares (D-RLS) algorithm. By introducing a cost function that downweights observations with small innovations, we develop the first censoring strategy based on the alternating minimization algorithm and the stochastic Newton method; it saves computation whenever a datum is censored. The second censoring strategy reduces the computation and communication costs further by preventing a node from updating its local estimate and transmitting it to neighbors whenever its current innovation is below a threshold. For both strategies, a simple criterion for selecting the innovation threshold is given so as to reach a target data-reduction ratio. The proposed censored D-RLS algorithms are guaranteed to converge to the optimal argument in the mean-square-deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms.
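
To make the censoring idea concrete, below is a minimal single-node sketch in Python of the second strategy from the summary: skip the RLS update whenever the innovation magnitude falls below a threshold. It also shows one plausible threshold rule that targets a given censoring ratio under a Gaussian-innovation assumption. All names, parameters, and the scipy-based quantile rule are illustrative assumptions; the paper's distributed (multi-node) algorithm, its alternating-minimization/stochastic-Newton variant, and its exact threshold criterion are not reproduced here.

# Minimal single-node sketch of innovation-based censored RLS.
# NOT the authors' D-RLS implementation; see the hedges above.
import numpy as np
from scipy.stats import norm

def censoring_threshold(target_censor_ratio, innovation_std):
    """Pick tau so that, if innovations were zero-mean Gaussian with the
    given std, a fraction target_censor_ratio of them would satisfy
    |innovation| < tau and hence be censored (a common heuristic; the
    paper's actual criterion may differ)."""
    # P(|e| < tau) = ratio  =>  tau = std * Phi^{-1}((1 + ratio) / 2)
    return innovation_std * norm.ppf((1.0 + target_censor_ratio) / 2.0)

def censored_rls(H, y, tau, lam=0.99, delta=1e-2):
    """Run RLS over rows of H (regressors) and y (observations), skipping
    the update whenever the innovation magnitude falls below tau."""
    n = H.shape[1]
    theta = np.zeros(n)          # current estimate
    P = np.eye(n) / delta        # inverse sample covariance
    censored = 0
    for h, yk in zip(H, y):
        innovation = yk - h @ theta          # prediction error on new datum
        if abs(innovation) < tau:            # small innovation: censor it;
            censored += 1                    # no update (and, in D-RLS, no
            continue                         # transmission to neighbors)
        Ph = P @ h
        gain = Ph / (lam + h @ Ph)           # standard RLS gain
        theta = theta + gain * innovation
        P = (P - np.outer(gain, Ph)) / lam   # forgetting-factor update
    return theta, censored

# Tiny usage example with synthetic data.
rng = np.random.default_rng(0)
theta_true = rng.standard_normal(4)
H = rng.standard_normal((500, 4))
y = H @ theta_true + 0.1 * rng.standard_normal(500)
tau = censoring_threshold(target_censor_ratio=0.5, innovation_std=0.1)
theta_hat, censored = censored_rls(H, y, tau)
print(f"censored {censored}/500 updates, "
      f"error {np.linalg.norm(theta_hat - theta_true):.3f}")

Early in the run, innovations are large and few data are censored; as the estimate converges, innovations shrink toward the noise level and the censoring rate approaches the target, which is the intuition behind tying the threshold to a desired data-reduction ratio.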
ISSN: 2379-190X
DOI: 10.1109/ICASSP.2017.7953280