FLIS: Clustered Federated Learning via Inference Similarity for Non-IID Data Distribution

Bibliographic Details
Published in: IEEE Open Journal of the Computer Society, Vol. 4, pp. 1-12
Main Authors: Morafah, Mahdi; Vahidian, Saeed; Wang, Weijia; Lin, Bill
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023
Summary: Conventional federated learning (FL) approaches are ineffective in scenarios where clients have significant differences in the distributions of their local data. The Non-IID data distribution in the client data causes a drift in the local model updates from the global optima, which significantly impacts the performance of the trained models. In this paper, we present a new algorithm called FLIS that aims to address this problem by grouping clients into clusters that have jointly trainable data distributions. This is achieved by comparing the inference similarity of client models. Our proposed framework captures settings where different groups of users may have their own objectives (learning tasks), but by aggregating their data with others in the same cluster (same learning task), superior models can be derived via more efficient and personalized federated learning. We present experimental results to demonstrate the benefits of FLIS over the state-of-the-art approaches on the CIFAR-100/10, SVHN, and FMNIST datasets. Our code is available at https://github.com/MMorafah/FLIS.
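The abstract does not spell out how inference similarity is computed or how clusters are formed; the details are in the paper and the linked repository. As a rough illustration only, the sketch below assumes a server-held probe batch, cosine similarity between flattened client prediction matrices, and a simple greedy threshold clustering; all of these choices are assumptions of this sketch, not necessarily the FLIS procedure.

```python
import numpy as np

def inference_similarity_matrix(client_models, probe_batch):
    """Pairwise cosine similarity of client predictions on a shared probe batch.

    client_models: list of callables, each mapping probe inputs to an
                   (n_probe, n_classes) matrix of class scores.
    probe_batch:   server-side probe inputs (an assumption of this sketch).
    """
    # Flatten each client's predictions into one vector per client.
    preds = np.stack([np.asarray(m(probe_batch)).ravel() for m in client_models])
    norms = np.linalg.norm(preds, axis=1, keepdims=True)
    normalized = preds / np.clip(norms, 1e-12, None)
    return normalized @ normalized.T          # (n_clients, n_clients) similarity

def cluster_by_similarity(sim, threshold=0.9):
    """Greedy clustering: a client joins the first cluster whose representative
    it resembles above the threshold, otherwise it starts a new cluster."""
    clusters = []                             # each cluster is a list of client indices
    for i in range(sim.shape[0]):
        for cluster in clusters:
            if sim[i, cluster[0]] >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probe = rng.normal(size=(32, 8))          # toy probe batch
    # Two synthetic "tasks": clients 0-2 share one linear head, clients 3-4 another.
    w_a, w_b = rng.normal(size=(8, 5)), rng.normal(size=(8, 5))
    models = [lambda x, w=w_a: x @ w] * 3 + [lambda x, w=w_b: x @ w] * 2
    sim = inference_similarity_matrix(models, probe)
    print(cluster_by_similarity(sim, threshold=0.9))   # e.g. [[0, 1, 2], [3, 4]]
```

In this toy run, clients trained on the same synthetic task produce near-identical predictions on the probe batch and land in one cluster, while clients with a different task form their own; each cluster can then be aggregated and trained separately, which is the personalization idea the abstract describes.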
ISSN: 2644-1268
DOI: 10.1109/OJCS.2023.3262203