OFedIT: Communication-Efficient Online Federated Learning with Intermittent Transmission
Published in | 2022 13th International Conference on Information and Communication Technology Convergence (ICTC), pp. 1189 - 1192 |
---|---|
Main Authors | , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 19.10.2022 |
Summary: | We study online federated learning (OFL), in which many edge nodes receive their own data sequentially and train a sequence of global functions (or models) under the orchestration of a central server while keeping data localized. In this framework, finding a communication-efficient algorithm is one of the key challenges of OFL. We present a communication-efficient OFL algorithm (named OFedIT) that uses intermittent transmissions. Our main contribution is to prove theoretically that OFedIT over $T$ time slots achieves an optimal sublinear regret bound $\mathcal{O}(\sqrt{T})$. Furthermore, this asymptotic optimality holds even when data and system heterogeneity are taken into account. Our analysis reveals that OFedIT yields almost the same performance as the centralized counterpart (i.e., all local data gathered at the server) while offering advantages in communication cost and data privacy. |
---|---|
ISSN: | 2162-1241 |
DOI: | 10.1109/ICTC55196.2022.9952884 |
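The core idea summarized in the abstract can be illustrated with a small sketch. This is not the paper's OFedIT algorithm; it is a generic, hypothetical simulation of intermittent transmission in online federated learning: each client runs online gradient descent on its own data stream, and the server averages (synchronizes) the local models only every `K` time slots instead of every slot, cutting communication by a factor of `K`. All names, dimensions, and the `1/sqrt(t)` step size (a standard choice for sublinear regret in online convex optimization) are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of intermittent transmission in online federated
# learning (OFL). NOT the paper's OFedIT algorithm -- just the idea:
# clients take local online gradient steps on streaming data, and the
# server averages the models only once every K slots.

rng = np.random.default_rng(0)
N, d, T, K = 5, 3, 200, 10        # clients, dimension, time slots, sync period
w_true = np.array([1.0, -2.0, 0.5])
W = np.zeros((N, d))              # one local model per client
losses = []

for t in range(1, T + 1):
    eta = 1.0 / np.sqrt(t)        # decaying step size (standard in OCO)
    step_loss = 0.0
    for i in range(N):
        x = rng.normal(size=d)                 # client i's sample at slot t
        y = w_true @ x + 0.1 * rng.normal()    # noisy linear target
        err = W[i] @ x - y
        step_loss += 0.5 * err ** 2
        W[i] -= eta * err * x                  # local online gradient step
    losses.append(step_loss / N)
    if t % K == 0:                # intermittent transmission: communicate
        W[:] = W.mean(axis=0)     # with the server only every K slots

early, late = np.mean(losses[:20]), np.mean(losses[-20:])
print(f"avg loss, first 20 slots: {early:.3f}; last 20 slots: {late:.3f}")
```

Averaging every slot (`K = 1`) recovers a fully synchronized baseline at `K` times the communication cost; the intermittent variant trades a small amount of transient disagreement between clients for that saving, which is the trade-off the paper's regret analysis quantifies.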