OFedIT: Communication-Efficient Online Federated Learning with Intermittent Transmission


Bibliographic Details
Published in: 2022 13th International Conference on Information and Communication Technology Convergence (ICTC), pp. 1189 - 1192
Main Authors: Kwon, Dohyeok; Park, Jonghwan; Hong, Songnam
Format: Conference Proceeding
Language: English
Published: IEEE, 19.10.2022

Summary: We study online federated learning (OFL), in which many edge nodes receive their own data sequentially and train a sequence of global functions (or models) under the orchestration of a central server while keeping data localized. In this framework, finding a communication-efficient algorithm is one of the central challenges. We present a communication-efficient OFL algorithm (named OFedIT) that uses intermittent transmissions. Our main contribution is to prove theoretically that OFedIT over T time slots achieves an optimal sublinear regret bound of \mathcal{O}(\sqrt{T}). Furthermore, this asymptotic optimality holds even when data and system heterogeneity are taken into account. Our analysis reveals that OFedIT yields almost the same performance as its centralized counterpart (i.e., all local data gathered at the server) while offering advantages in communication cost and data privacy.
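The intermittent-transmission idea in the abstract can be illustrated with a minimal sketch: each client runs online gradient descent on its own data stream, and the server averages and rebroadcasts the local models only every few time slots instead of every slot. This is an assumed online-FedAvg-style loop for illustration only, not the paper's exact OFedIT update; all names, the linear model, and the parameters (`K`, `T`, `lr`, `period`) are assumptions.

```python
import numpy as np

def run_ofl_sketch(K=10, T=200, d=5, lr=0.05, period=5, seed=0):
    """Illustrative OFL loop with intermittent transmission (assumed setup).

    K clients each observe one (x, y) pair per time slot from a shared
    linear ground truth, take a local online gradient step on squared loss,
    and upload to the server only every `period` slots.
    Returns the time-averaged per-client loss.
    """
    rng = np.random.default_rng(seed)
    w_true = rng.normal(size=d)          # hypothetical data-generating weights
    w_local = np.zeros((K, d))           # each client's current local model
    cum_loss = 0.0
    for t in range(1, T + 1):
        for k in range(K):
            x = rng.normal(size=d)
            y = w_true @ x + 0.01 * rng.normal()   # data arrives sequentially
            pred = w_local[k] @ x
            cum_loss += 0.5 * (pred - y) ** 2
            grad = (pred - y) * x
            w_local[k] -= lr * grad                # local online gradient step
        if t % period == 0:                        # intermittent transmission:
            w_global = w_local.mean(axis=0)        # server averages uploads
            w_local[:] = w_global                  # and broadcasts the average
    return cum_loss / (K * T)
```

Setting `period > 1` is what saves communication relative to transmitting every slot; the paper's result says the regret penalty for doing so still leaves the optimal \mathcal{O}(\sqrt{T}) rate intact.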
ISSN: 2162-1241
DOI: 10.1109/ICTC55196.2022.9952884