Communication-Efficient Federated Multi-Task Learning with Sparse Sharing

Bibliographic Details
Published in: 2023 IEEE 34th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), pp. 1-6
Main Authors: Ai, Yuhan; Chen, Qimei; Liang, Yipeng; Jiang, Hao
Format: Conference Proceeding
Language: English
Published: IEEE, 05.09.2023

Summary: Federated multi-task learning (FMTL) is a promising technology for dealing with the severe data heterogeneity issue in federated learning (FL): each client learns an individual model locally, and the server extracts similar model parameters across tasks to preserve the personalization of client models. Hence, it is essential to precisely extract the model parameters shared among tasks. On the other hand, limited communication resources restrict model transmission and thus affect FMTL performance. To address these issues, we propose a novel FMTL with Sparse Sharing (FedSS) mechanism that allows clients to share model parameters dynamically, according to diversified model structures, under limited communication resources. In particular, we present an adaptive quantization approach for task relevance, which serves as a metric to evaluate the extent of model sharing across tasks. The objective function is formulated to minimize model transmission latency while ensuring FMTL learning performance via a joint bandwidth allocation and client selection strategy. Closed-form expressions for the optimal client selection and bandwidth allocation are derived based on an alternating direction method of multipliers (ADMM) algorithm. Numerical results show that the proposed FedSS outperforms the benchmarks and achieves efficient communication performance.
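To make the sparse-sharing idea concrete, the following is a minimal NumPy sketch of server-side aggregation in which only coordinates deemed "relevant" across tasks are averaged, while the rest stay personalized. The relevance proxy (inverse coordinate-wise variance), the threshold, and all function names are illustrative assumptions, not the paper's actual adaptive quantization metric or FedSS formulation.

```python
import numpy as np

def sparse_share(client_params, relevance_threshold=0.5):
    """Average only the 'shared' coordinates across client models.

    client_params: list of 1-D parameter vectors, one per client/task.
    Coordinates whose cross-task relevance exceeds the threshold are
    replaced by the global average; the others remain personalized.
    """
    stacked = np.stack(client_params)            # shape: (num_clients, dim)
    # Crude relevance proxy (an assumption, not the paper's metric):
    # low variance across tasks -> parameters behave similarly -> share them.
    relevance = 1.0 / (1.0 + stacked.var(axis=0))
    shared_mask = relevance >= relevance_threshold
    global_avg = stacked.mean(axis=0)
    # Shared coordinates take the average; personalized ones are untouched.
    return [np.where(shared_mask, global_avg, p) for p in stacked]
```

For example, two clients agreeing on one coordinate but diverging on another would end up sharing only the first: the agreed coordinate is averaged across clients, while the divergent one is left per-client.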
ISSN:2166-9589
DOI:10.1109/PIMRC56721.2023.10293880