Distributed recursive least-squares with data-adaptive censoring
Published in | 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5860-5864 |
---|---|
Main Authors | |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.03.2017 |
Summary: | The deluge of networked big data motivates the development of computation- and communication-efficient network information processing algorithms. In this paper, we propose two data-adaptive censoring strategies that significantly reduce the computation and communication costs of the distributed recursive least-squares (D-RLS) algorithm. By introducing a cost function that downweights observations with small innovations, we develop the first censoring strategy based on the alternating minimization algorithm and the stochastic Newton method; it saves computation whenever a datum is censored. The second censoring strategy reduces the computation and communication costs further by preventing a node from updating and transmitting its local estimate to neighbors when its current innovation is below a threshold. For both strategies, a simple criterion for selecting the innovation threshold is given so as to reach a target data-reduction ratio. The proposed censored D-RLS algorithms are guaranteed to converge to the optimal argument in the mean-square-deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms. |
---|---|
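The second censoring rule from the abstract lends itself to a compact illustration. The sketch below is a hypothetical single-node Python rendering, not the paper's full distributed algorithm: the function name `censored_rls_step`, the threshold `tau`, and the forgetting factor `lam` are our own choices, and the exchange of estimates with neighbors in D-RLS is omitted. It shows only the core mechanism: an innovation below the threshold skips both the local RLS update and the transmission.

```python
import numpy as np

def censored_rls_step(w, P, h, y, tau, lam=1.0):
    """One censored RLS step at a single node (illustrative sketch only).

    w   : current local estimate, shape (d,)
    P   : inverse-correlation matrix, shape (d, d)
    h   : regressor vector, shape (d,)
    y   : scalar observation
    tau : censoring threshold on the innovation magnitude (assumed given;
          the paper derives a criterion to pick it for a target data-reduction ratio)
    lam : forgetting factor

    Returns (w, P, transmitted); transmitted indicates whether the node
    would update and send its estimate to its neighbors.
    """
    innovation = y - h @ w
    if abs(innovation) < tau:
        # Datum censored: skip both the update and the transmission,
        # saving computation and communication.
        return w, P, False
    # Standard RLS update for an informative (uncensored) datum.
    Ph = P @ h
    k = Ph / (lam + h @ Ph)            # gain vector
    w = w + k * innovation
    P = (P - np.outer(k, Ph)) / lam    # valid for symmetric P
    return w, P, True

# Hypothetical usage: track a 3-dimensional parameter from streaming data
# and count how many updates are actually transmitted.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(3)
w, P, sent = np.zeros(3), np.eye(3), 0
for _ in range(500):
    h = rng.standard_normal(3)
    y = h @ w_true + 0.1 * rng.standard_normal()
    w, P, tx = censored_rls_step(w, P, h, y, tau=0.3)
    sent += tx
```

Raising `tau` censors more data and lowers the communication count `sent`, at the price of slower convergence; the paper's threshold-selection criterion formalizes this trade-off for a target censoring ratio.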
ISSN: 2379-190X
DOI: 10.1109/ICASSP.2017.7953280