Distributionally Robust Federated Learning for Mobile Edge Networks
Published in | Mobile Networks and Applications, Vol. 29, No. 1, pp. 262-272
Main Authors |
Format | Journal Article
Language | English
Published | New York: Springer US, 01.02.2024 (Springer Nature B.V.)
Summary: Federated Learning (FL) revolutionizes data processing in mobile networks by enabling collaborative learning without data exchange. This not only reduces latency and enhances computational efficiency but also enables the system to adapt, learn, and optimize performance based on the user's context in real time. Nevertheless, FL faces challenges in training and generalization due to statistical heterogeneity, stemming from the diverse nature of data across varying user contexts. To address these challenges, we propose WAFL, a robust FL framework grounded in Wasserstein distributionally robust optimization, aimed at enhancing model generalization against all adversarial distributions within a predefined Wasserstein ambiguity set. We approach WAFL by formulating it as an empirical surrogate risk minimization problem, which is then solved using a novel federated algorithm. Experimental results demonstrate that WAFL outperforms other robust FL baselines in non-i.i.d. settings, showcasing superior generalization and robustness to significant distribution shifts.
ISSN: 1383-469X, 1572-8153
DOI: 10.1007/s11036-024-02316-w
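The abstract names the ingredients (a Wasserstein ambiguity set, an empirical surrogate risk, a federated solver) without detailing them, so the sketch below shows only how such pieces commonly fit together; it is not the authors' WAFL algorithm. It pairs the standard Lagrangian relaxation of Wasserstein DRO (inner gradient ascent on perturbed inputs, penalized by the transport cost, in the style of Sinha et al.'s adversarial training surrogate) with a plain FedAvg loop. All function names and hyperparameters (`gamma`, `inner_steps`, and so on) are illustrative assumptions.

```python
# Illustrative sketch: Wasserstein-DRO-style local training inside FedAvg.
# NOT the paper's WAFL algorithm; the surrogate here is the common
# Lagrangian relaxation  max_z  loss(w; z) - gamma * ||z - x||^2.
import copy
import torch
import torch.nn as nn

def robust_local_update(model, data, targets, gamma=1.0, inner_steps=5,
                        inner_lr=0.1, outer_lr=0.01):
    """One client step of surrogate risk minimization (assumed form)."""
    loss_fn = nn.CrossEntropyLoss()

    # Inner maximization: push inputs toward the worst case within a
    # (relaxed) Wasserstein ball around the client's data.
    z = data.clone().requires_grad_(True)
    for _ in range(inner_steps):
        surrogate = loss_fn(model(z), targets) - gamma * ((z - data) ** 2).mean()
        grad, = torch.autograd.grad(surrogate, z)
        z = (z + inner_lr * grad).detach().requires_grad_(True)

    # Outer minimization: descend the model weights on the worst-case data.
    opt = torch.optim.SGD(model.parameters(), lr=outer_lr)
    opt.zero_grad()
    loss_fn(model(z.detach()), targets).backward()
    opt.step()
    return model

def fedavg_round(global_model, client_loaders):
    """One communication round: robust local training, then averaging."""
    client_states = []
    for loader in client_loaders:
        local = copy.deepcopy(global_model)
        for data, targets in loader:
            robust_local_update(local, data, targets)
        client_states.append(local.state_dict())

    # Simple unweighted FedAvg aggregation of the client models.
    avg = {k: torch.stack([s[k].float() for s in client_states]).mean(0)
           for k in client_states[0]}
    global_model.load_state_dict(avg)
    return global_model

if __name__ == "__main__":
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(10, 2))
    # Two synthetic "clients", each holding one batch of 32 samples.
    loaders = [[(torch.randn(32, 10), torch.randint(0, 2, (32,)))]
               for _ in range(2)]
    for _ in range(3):
        model = fedavg_round(model, loaders)
    print("finished 3 robust FedAvg rounds")
```

In this relaxation, `gamma` trades off robustness against fidelity to the empirical distribution: a larger penalty keeps the adversarial inputs close to the observed data, shrinking the effective ambiguity set, while a smaller one allows larger distribution shifts during training.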