Communication-Efficient Federated Multi-Task Learning with Sparse Sharing
Published in | 2023 IEEE 34th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), pp. 1 - 6 |
Main Authors | Ai, Yuhan; Chen, Qimei; Liang, Yipeng; Jiang, Hao |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 05.09.2023 |
Abstract | Federated multi-task learning (FMTL) is a promising technology to deal with the severe data heterogeneity issue in federated learning (FL), where each client learns an individual model locally and the server extracts similar model parameters across tasks to preserve the personalization of client models. Hence, it is essential to precisely extract the model parameters shared among tasks. On the other hand, limited communication resources restrict model transmission and thus influence FMTL performance. To address the above issues, we propose a novel FMTL with Sparse Sharing (FedSS) mechanism that allows clients to share model parameters dynamically, according to diversified model structures, under limited communication resources. In particular, we present an adaptive quantization approach for task relevance, which serves as a metric to evaluate the extent of model sharing across tasks. The objective function is formulated to minimize the model transmission latency while ensuring the FMTL learning performance via a joint bandwidth allocation and client selection strategy. Closed-form expressions for the optimal client selection and bandwidth allocation are derived based on an alternating direction method of multipliers (ADMM) algorithm. Numerical results show that the proposed FedSS outperforms the benchmarks and achieves efficient communication performance. |
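The sparse-sharing idea summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the `masked_aggregate` helper and the binary-mask representation are assumptions for illustration only. The idea shown is that each client marks which parameter positions it is willing to share, the server averages only positions that are marked as shared, and all other positions keep each client's local (personalized) values.

```python
def masked_aggregate(client_params, client_masks):
    """Federated aggregation with sparse sharing: a parameter position is
    averaged only over the clients whose mask marks it as shared; the
    remaining positions keep each client's local (personalized) value.

    client_params: list of equal-length lists of floats, one per client.
    client_masks:  list of equal-length lists of 0/1 flags, one per client.
    """
    n_pos = len(client_params[0])
    shared_avg = [0.0] * n_pos
    for j in range(n_pos):
        vals = [p[j] for p, m in zip(client_params, client_masks) if m[j]]
        if vals:
            shared_avg[j] = sum(vals) / len(vals)
    # Each client adopts the shared average only where its own mask is set.
    return [
        [shared_avg[j] if m[j] else p[j] for j in range(n_pos)]
        for p, m in zip(client_params, client_masks)
    ]

# Two clients, four parameters; only positions 0 and 2 are shared by both.
p1, m1 = [1.0, 2.0, 3.0, 4.0], [1, 1, 1, 0]
p2, m2 = [5.0, 6.0, 7.0, 8.0], [1, 0, 1, 1]
new_p1, new_p2 = masked_aggregate([p1, p2], [m1, m2])
# new_p1 == [3.0, 2.0, 5.0, 4.0]; new_p2 == [3.0, 6.0, 5.0, 8.0]
```

Note how position 1 (shared only by client 1) and position 3 (shared only by client 2) remain personalized, while positions 0 and 2 converge to the cross-client average; the paper's contribution is deciding these masks dynamically from task relevance and doing so under communication constraints, which this sketch does not model.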
Author | Ai, Yuhan (aiyuhan@whu.edu.cn); Chen, Qimei (chenqimei@whu.edu.cn); Liang, Yipeng (liangyipeng@whu.edu.cn); Jiang, Hao (jianghao@whu.edu.cn); all affiliated with Wuhan University, School of Electronic Information, Wuhan, China, 430072 |
ContentType | Conference Proceeding |
DOI | 10.1109/PIMRC56721.2023.10293880 |
Discipline | Engineering |
EISBN | 1665464836; 9781665464833 |
EISSN | 2166-9589 |
EndPage | 6 |
ExternalDocumentID | 10293880 |
Genre | orig-research |
GrantInformation | Fundamental Research Funds for the Central Universities (10.13039/501100012226); Research and Development (10.13039/100006190) |
IsPeerReviewed | false |
IsScholarly | false |
Language | English |
PageCount | 6 |
PublicationDate | 2023-Sept.-5 |
PublicationDateYYYYMMDD | 2023-09-05 |
PublicationTitle | 2023 IEEE 34th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC) |
PublicationTitleAbbrev | PIMRC |
PublicationYear | 2023 |
Publisher | IEEE |
StartPage | 1 |
SubjectTerms | Adaptation models; bandwidth allocation; Channel allocation; client selection; Closed-form solutions; Data models; Federated multi-task learning; Multitasking; Numerical models; Quantization (signal); sparse sharing |
Title | Communication-Efficient Federated Multi-Task Learning with Sparse Sharing |
URI | https://ieeexplore.ieee.org/document/10293880 |