Privacy-Enhanced Federated Learning for Non-IID Data

Bibliographic Details
Published in: Mathematics (Basel), Vol. 11, No. 19, p. 4123
Main Authors: Tan, Qingjie; Wu, Shuhui; Tao, Yuanhong
Format: Journal Article
Language: English
Published: Basel, MDPI AG, 01.10.2023
Summary: Federated learning (FL) allows a vast number of decentralized clients to collaboratively train a shared model while keeping their data private and unshared. In practical settings, the training data used in FL often exhibit non-IID characteristics, which diminishes the efficacy of FL. Our study presents a novel privacy-preserving FL algorithm, HW-DPFL, whose design leverages the similarity of clients' data label distributions. The proposed approach achieves this without incurring any additional communication overhead. We provide theoretical and empirical evidence that our approach improves both the privacy guarantee and the convergence of FL.
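The abstract does not spell out how label-distribution similarity is computed, so the following is only a minimal sketch of one plausible reading, not the paper's actual HW-DPFL algorithm: each client summarizes its local labels as a normalized histogram, perturbs it with Gaussian noise before sharing (a stand-in for a proper differential-privacy mechanism), and the server compares histograms via Hellinger distance. All function names and the noise scale are illustrative assumptions.

```python
import numpy as np

def label_distribution(labels, num_classes):
    """Normalized histogram of the class labels held by one client."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / counts.sum()

def hellinger_similarity(p, q):
    """Similarity in [0, 1]: 1 minus the Hellinger distance between two distributions."""
    h = np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))
    return 1.0 - h

def noisy_distribution(p, noise_scale=0.05, rng=None):
    """Perturb a label distribution before sharing it (illustrative DP-style noise,
    not a calibrated differential-privacy mechanism)."""
    rng = rng or np.random.default_rng(0)
    noisy = np.clip(p + rng.normal(0.0, noise_scale, size=p.shape), 0.0, None)
    return noisy / noisy.sum()

# Two clients with skewed (non-IID) label distributions over 3 classes.
client_a = label_distribution(np.array([0, 0, 0, 1]), num_classes=3)
client_b = label_distribution(np.array([2, 2, 1, 1]), num_classes=3)

# The server could use this similarity score to group or weight clients.
sim = hellinger_similarity(noisy_distribution(client_a), noisy_distribution(client_b))
```

Because each client shares only a small, noise-perturbed histogram rather than raw data or gradients, a scheme along these lines would add essentially no communication beyond the usual model updates, which is consistent with the abstract's "no additional communication overhead" claim.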
ISSN: 2227-7390
DOI: 10.3390/math11194123