Rate-Constrained Quantization for Communication-Efficient Federated Learning
| Published in | Proceedings of the ... IEEE International Conference on Acoustics, Speech and Signal Processing (2025), pp. 1-5 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 06.04.2025 |
| ISSN | 2379-190X |
| DOI | 10.1109/ICASSP49660.2025.10889213 |
Summary: Quantization is a common approach to mitigating the communication cost of federated learning (FL). In practice, the quantized local parameters are further encoded via an entropy coding technique, such as Huffman coding, for efficient data compression. In this case, the exact communication overhead is determined by the bit rate of the encoded gradients. Recognizing this fact, this work deviates from the existing approaches in the literature and develops a novel quantized FL framework, called rate-constrained federated learning (RC-FED), in which we deploy the conventional entropy-constrained scalar quantization technique to quantize the gradients subject to both fidelity and data-rate constraints. In particular, we formulate this scheme as a joint optimization in which the quantization distortion is minimized while the rate of the encoded gradients is kept below a target threshold. This enables a tunable trade-off between quantization distortion and communication cost. We analyze the convergence behavior of RC-FED and show its superior performance against baseline quantized FL schemes on several datasets.
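As a rough sketch of the joint optimization described in the summary (the notation here is ours, not the paper's: $g$ denotes a local gradient, $Q$ a scalar quantizer, $H(Q(g))$ the entropy of the encoded output in bits, and $R_{\mathrm{t}}$ the target rate), an entropy-constrained quantizer design of this kind typically reads

$$
\min_{Q} \; \mathbb{E}\big[\lVert g - Q(g) \rVert^2\big]
\quad \text{subject to} \quad H\big(Q(g)\big) \le R_{\mathrm{t}},
$$

which is commonly handled via the Lagrangian relaxation $\mathbb{E}[\lVert g - Q(g) \rVert^2] + \lambda \, H(Q(g))$, where $\lambda \ge 0$ tunes the trade-off between distortion and communication cost.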
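Below is a minimal, self-contained Python sketch of the general idea, not the paper's RC-FED algorithm: uniformly quantize a gradient vector, use the empirical entropy of the quantization indices as a proxy for the bit rate a Huffman-style entropy coder would achieve, and pick the quantizer that minimizes distortion while keeping that rate under a target threshold. The function names, step-size grid, and rate budget are illustrative assumptions.

```python
import numpy as np

def quantize_uniform(g, step):
    """Uniform scalar quantization: map each gradient entry to the
    nearest multiple of `step`; return (indices, reconstruction)."""
    idx = np.round(g / step).astype(np.int64)
    return idx, idx * step

def empirical_entropy_bits(idx):
    """Empirical entropy (bits/symbol) of the quantization indices,
    a standard proxy for the rate of an entropy coder such as Huffman."""
    _, counts = np.unique(idx, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def rate_constrained_quantize(g, rate_budget_bits, steps):
    """Among candidate step sizes, minimize quantization distortion (MSE)
    subject to the empirical rate staying below the budget."""
    best = None
    for step in steps:
        idx, g_hat = quantize_uniform(g, step)
        rate = empirical_entropy_bits(idx)
        if rate <= rate_budget_bits:
            dist = float(np.mean((g - g_hat) ** 2))
            if best is None or dist < best[2]:
                best = (step, rate, dist, g_hat)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = rng.normal(scale=0.1, size=10_000)  # stand-in for a local gradient
    steps = np.geomspace(1e-3, 1e-1, 20)    # candidate quantizer step sizes
    result = rate_constrained_quantize(g, rate_budget_bits=3.0, steps=steps)
    step, rate, dist, _ = result
    print(f"step={step:.4f}  rate={rate:.2f} bits/coord  MSE={dist:.2e}")
```

In an FL round, each client would run something like `rate_constrained_quantize` on its local update before entropy coding and upload, so the per-round communication cost is bounded by the chosen rate budget; the paper's formulation optimizes the quantizer itself rather than merely sweeping a step size.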