On the Importance of Temporal Dependencies of Weight Updates in Communication Efficient Federated Learning

Bibliographic Details
Published in: 2022 IEEE International Conference on Visual Communications and Image Processing (VCIP), pp. 1-5
Main Authors: Afrabandpey, Homayun; Rangu, Goutham; Zhang, Honglei; Cricri, Francesco; Aksu, Emre; Tavakoli, Hamed R.
Format: Conference Proceeding
Language: English
Published: IEEE, 13.12.2022

Summary: This paper studies the effect of exploiting the temporal dependency of successive weight updates on compressing communications in Federated Learning (FL). To this end, we propose residual coding for FL, which exploits temporal dependencies by communicating compressed residuals of the weight updates whenever doing so reduces bandwidth. We further consider Temporal Context Adaptation (TCA), which compares co-located elements of consecutive weight updates to select the optimal setting for compressing the bitstream in the DeepCABAC encoder. Following the experimental settings of the MPEG standard on Neural Network Compression (NNC), we demonstrate that both temporal-dependency-based technologies reduce communication overhead, with the maximum reduction obtained when both technologies are used simultaneously.
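
To make the residual-coding idea in the summary concrete, the sketch below chooses between sending a compressed current weight update and a compressed temporal residual (current minus previous update), whichever is smaller in bytes. This is only an illustration under assumed components: uniform quantization with an arbitrary step and zlib stand in for the DeepCABAC pipeline, and the function names are placeholders, not the paper's API.

```python
# Hypothetical sketch of residual coding for FL weight updates.
# Quantization step and zlib are assumptions standing in for DeepCABAC.
import zlib
import numpy as np

STEP = 0.01  # assumed uniform quantization step (not from the paper)

def compress(arr: np.ndarray) -> bytes:
    """Quantize to integers, then entropy-code with a stand-in compressor (zlib)."""
    q = np.round(arr / STEP).astype(np.int16)
    return zlib.compress(q.tobytes())

def encode_update(prev_update: np.ndarray, curr_update: np.ndarray):
    """Send whichever of {direct update, temporal residual} compresses smaller."""
    residual = curr_update - prev_update
    direct_payload = compress(curr_update)
    residual_payload = compress(residual)
    if len(residual_payload) < len(direct_payload):
        # Decoder reconstructs curr_update = prev_update + residual.
        return "residual", residual_payload
    return "direct", direct_payload

# Toy federated round: consecutive updates are temporally correlated,
# so the residual is small-valued and compresses better.
rng = np.random.default_rng(0)
prev = rng.normal(size=10_000).astype(np.float32)
curr = prev + 0.02 * rng.normal(size=10_000).astype(np.float32)
mode, payload = encode_update(prev, curr)
print(mode, len(payload), "bytes")
```

In this toy setting the residual mode is selected because the residual values occupy a much narrower range than the raw update, which is the bandwidth benefit the paper attributes to exploiting temporal dependencies.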
ISSN:2642-9357
DOI:10.1109/VCIP56404.2022.10008860