Communication-Efficient Federated Learning via Predictive Coding

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Signal Processing, Vol. 16, No. 3, pp. 369-380
Main Authors: Yue, Kai; Jin, Richeng; Wong, Chau-Wai; Dai, Huaiyu
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2022

Summary: Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping training data local. For wireless mobile devices, the communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has used data compression tools such as quantization and sparsification to reduce this overhead. In this paper, we propose a predictive coding-based compression scheme for federated learning. The scheme shares prediction functions among all devices and lets each worker transmit only a compressed residual vector derived from the reference. In each communication round, the predictor and quantizer are selected based on the rate-distortion cost, and redundancy is further reduced with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% while achieving even better learning performance than other baseline methods.
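The exact construction is specified in the full paper; the sketch below is only a minimal, hypothetical Python illustration of the mechanism the summary describes: each worker predicts its update from previously reconstructed updates via prediction functions shared with the server, quantizes the prediction residual, and selects the (predictor, quantizer) pair that minimizes a rate-distortion cost, with the empirical entropy of the quantizer output standing in for the entropy-coded rate. The predictor set, the uniform quantizer, the bit-width choices, and the weight RD_LAMBDA are assumptions for illustration, not details taken from the paper.

import numpy as np

# All names below (PREDICTORS, RD_LAMBDA, compress_update, ...) are
# hypothetical illustrations, not identifiers from the paper.

RD_LAMBDA = 0.1  # assumed trade-off weight in the Lagrangian cost J = D + lambda * R

def predict_zero(history):
    # No prediction: the residual equals the raw update.
    return np.zeros_like(history[-1])

def predict_previous(history):
    # Repeat the most recent reconstructed update.
    return history[-1]

def predict_linear(history):
    # Linear extrapolation from the last two reconstructed updates.
    if len(history) < 2:
        return history[-1]
    return 2.0 * history[-1] - history[-2]

# Prediction functions shared by the server and every worker, so both
# sides can form the same reference without extra transmission.
PREDICTORS = [predict_zero, predict_previous, predict_linear]

def quantize(residual, num_bits):
    # Uniform scalar quantizer: returns integer indices plus the step size.
    scale = np.max(np.abs(residual)) + 1e-12
    step = scale / (2 ** (num_bits - 1) - 1)
    return np.round(residual / step).astype(np.int64), step

def dequantize(indices, step):
    return indices * step

def entropy_bits(indices):
    # Empirical entropy of the index stream, a proxy for the entropy-coded rate.
    _, counts = np.unique(indices, return_counts=True)
    p = counts / counts.sum()
    return float(counts.sum() * -np.sum(p * np.log2(p)))

def compress_update(update, history, bit_choices=(2, 4, 8)):
    # Search every (predictor, quantizer) pair and keep the cheapest one
    # under the rate-distortion cost.
    best = None
    for p_id, predictor in enumerate(PREDICTORS):
        residual = update - predictor(history)
        for bits in bit_choices:
            idx, step = quantize(residual, bits)
            distortion = float(np.mean((residual - dequantize(idx, step)) ** 2))
            cost = distortion + RD_LAMBDA * entropy_bits(idx)
            if best is None or cost < best[0]:
                best = (cost, p_id, idx, step)
    _, p_id, idx, step = best
    return p_id, idx, step

# Toy round-trip: the server reconstructs from the predictor id and the
# quantized residual, and both sides append the same reconstruction.
rng = np.random.default_rng(0)
history = [np.zeros(1000)]
for _ in range(3):
    update = history[-1] + 0.1 * rng.standard_normal(1000)
    p_id, idx, step = compress_update(update, history)
    reconstructed = PREDICTORS[p_id](history) + dequantize(idx, step)
    history.append(reconstructed)

Note the closed-loop design: predictions are formed from reconstructed rather than raw updates, which is the standard way predictive coders keep the encoder's and decoder's references identical.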
ISSN: 1932-4553, 1941-0484
DOI: 10.1109/JSTSP.2022.3142678