Fast-Convergent Federated Learning With Adaptive Weighting

Bibliographic Details
Published in: IEEE Transactions on Cognitive Communications and Networking, Vol. 7, no. 4, pp. 1078-1088
Main Authors: Wu, Hongda; Wang, Ping
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2021
ISSN: 2332-7731
EISSN: 2332-7731
DOI: 10.1109/TCCN.2021.3084406

Abstract Federated learning (FL) enables resource-constrained edge nodes to collaboratively learn a global model under the orchestration of a central server while keeping privacy-sensitive data locally. The non-independent-and-identically-distributed (non-IID) data samples across participating nodes slow model training and impose additional communication rounds for FL to converge. In this paper, we propose the Federated Adaptive Weighting (FedAdp) algorithm, which aims to accelerate model convergence in the presence of nodes with non-IID datasets. Through theoretical and empirical analysis, we observe an implicit connection between a node's contribution to the global model aggregation and the data distribution on that node. We then propose to adaptively assign different weights for updating the global model based on node contribution in each training round. The contribution of a participating node is first measured by the angle between its local gradient vector and the global gradient vector, and the weight is then quantified by a designed non-linear mapping function. This simple yet effective strategy dynamically reinforces positive (and suppresses negative) node contributions, drastically reducing the number of communication rounds. Its superiority over the commonly adopted Federated Averaging (FedAvg) is verified both theoretically and experimentally. With extensive experiments performed in PyTorch and PySyft, we show that FL training with FedAdp can reduce the number of communication rounds by up to 54.1% on the MNIST dataset and up to 45.4% on the FashionMNIST dataset, as compared to the FedAvg algorithm.
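The abstract above describes FedAdp's core mechanism: measure each node's contribution by the angle between its local gradient and the global gradient, then turn that angle into an aggregation weight through a non-linear mapping. Below is a minimal Python/NumPy sketch of that idea; the Gompertz-style mapping, the softmax normalization, the parameter alpha, and the function names (instantaneous_angle, adaptive_weights) are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def instantaneous_angle(local_grad, global_grad):
    # Angle (radians) between one node's gradient and the aggregated global gradient.
    cos_sim = np.dot(local_grad, global_grad) / (
        np.linalg.norm(local_grad) * np.linalg.norm(global_grad) + 1e-12)
    return float(np.arccos(np.clip(cos_sim, -1.0, 1.0)))

def adaptive_weights(local_grads, data_sizes, alpha=5.0):
    # A data-size-weighted average stands in for this round's global gradient.
    global_grad = np.average(local_grads, axis=0, weights=data_sizes)
    angles = np.array([instantaneous_angle(g, global_grad) for g in local_grads])
    # Gompertz-like decreasing map: small angle (well-aligned update) -> large score.
    scores = alpha * (1.0 - np.exp(-np.exp(-alpha * (angles - 1.0))))
    # Normalize scores into aggregation weights (softmax), reinforcing positive
    # contributions and suppressing nodes whose updates point away from the global one.
    return np.exp(scores) / np.exp(scores).sum()

# Toy usage: two well-aligned nodes and one node pulling in the opposite direction.
grads = [np.array([1.0, 0.5, -0.2, 0.1]),
         np.array([0.9, 0.6, -0.1, 0.2]),
         np.array([-0.8, -0.4, 0.3, -0.1])]
print(adaptive_weights(grads, data_sizes=[100, 100, 100]))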
Author Wu, Hongda
Wang, Ping
Author_xml – sequence: 1
  givenname: Hongda
  surname: Wu
  fullname: Wu, Hongda
  email: hwu1226@eecs.yorku.ca
  organization: Department of Electrical Engineering and Computer Science, Lassonde School of Engineering, York University, Toronto, ON, Canada
– sequence: 2
  givenname: Ping
  orcidid: 0000-0002-1599-5480
  surname: Wang
  fullname: Wang, Ping
  email: pingw@yorku.ca
  organization: Department of Electrical Engineering and Computer Science, Lassonde School of Engineering, York University, Toronto, ON, Canada
CODEN ITCCG7
CitedBy_id crossref_primary_10_1016_j_phycom_2023_102164
crossref_primary_10_1145_3678182
crossref_primary_10_1109_TNSE_2025_3528982
crossref_primary_10_1109_TSP_2025_3536023
crossref_primary_10_1109_OJCOMS_2024_3513816
crossref_primary_10_1109_TMC_2024_3402080
crossref_primary_10_1109_JSYST_2023_3236995
crossref_primary_10_1109_TCCN_2024_3394889
crossref_primary_10_1109_ACCESS_2022_3141913
crossref_primary_10_3390_s25051441
crossref_primary_10_1016_j_compbiomed_2024_108905
crossref_primary_10_1109_TVT_2023_3298787
crossref_primary_10_1016_j_future_2024_01_007
crossref_primary_10_1109_TNET_2024_3377655
crossref_primary_10_1145_3600225
crossref_primary_10_1088_1361_6501_acf77d
crossref_primary_10_1016_j_comcom_2025_108104
crossref_primary_10_1109_ACCESS_2024_3413069
crossref_primary_10_1109_TNSE_2022_3146399
crossref_primary_10_1109_OJCOMS_2024_3391731
crossref_primary_10_1145_3706419
crossref_primary_10_1145_3669903
crossref_primary_10_1016_j_ijepes_2023_109285
crossref_primary_10_1109_TVT_2021_3135332
crossref_primary_10_1016_j_knosys_2024_112484
crossref_primary_10_1002_int_23056
crossref_primary_10_1016_j_future_2023_09_008
crossref_primary_10_1109_TNET_2023_3323023
crossref_primary_10_1109_TSC_2023_3332102
crossref_primary_10_1109_TNSE_2024_3447904
crossref_primary_10_1109_TNSM_2023_3288738
crossref_primary_10_1109_JIOT_2023_3292797
crossref_primary_10_1016_j_comnet_2023_109712
crossref_primary_10_1145_3594779
crossref_primary_10_3390_math11194123
crossref_primary_10_1109_JSAC_2024_3431526
crossref_primary_10_1109_TMC_2023_3331906
crossref_primary_10_1109_TPDS_2023_3267897
crossref_primary_10_1049_cmu2_12379
crossref_primary_10_1109_TMC_2024_3365951
crossref_primary_10_1016_j_compag_2024_109720
crossref_primary_10_1109_JIOT_2024_3399259
crossref_primary_10_1016_j_future_2024_06_030
crossref_primary_10_1007_s13042_022_01647_y
crossref_primary_10_1109_TITS_2023_3265416
crossref_primary_10_3390_e26121099
crossref_primary_10_1016_j_knosys_2024_112937
crossref_primary_10_1016_j_jnca_2024_104086
crossref_primary_10_1109_JIOT_2024_3376548
crossref_primary_10_3390_electronics12143054
crossref_primary_10_1049_cmu2_12333
crossref_primary_10_1145_3697836
crossref_primary_10_1109_TAI_2023_3307664
crossref_primary_10_1016_j_knosys_2022_109960
crossref_primary_10_3390_app14072720
crossref_primary_10_1109_ACCESS_2023_3284976
crossref_primary_10_1007_s00530_024_01386_w
crossref_primary_10_1109_TAI_2024_3419757
crossref_primary_10_1016_j_cose_2023_103278
crossref_primary_10_1007_s10791_024_09478_x
crossref_primary_10_1016_j_cose_2023_103319
crossref_primary_10_1145_3701035
crossref_primary_10_1016_j_compeleceng_2024_109329
crossref_primary_10_1109_JBHI_2023_3319516
crossref_primary_10_1007_s11227_024_06514_x
crossref_primary_10_3390_math12060920
crossref_primary_10_1109_TCE_2024_3357530
crossref_primary_10_1016_j_comcom_2024_107964
crossref_primary_10_1109_TWC_2023_3301611
crossref_primary_10_1007_s10586_024_04837_1
crossref_primary_10_1016_j_hcc_2022_100068
crossref_primary_10_3390_s23229226
crossref_primary_10_1186_s13634_024_01192_6
Cites_doi 10.1109/INFOCOM41043.2020.9155494
10.1109/ICC.2019.8761315
10.1214/aoms/1177705673
10.1109/ICDCS.2019.00099
10.1109/JSAC.2020.3036952
10.1109/SOSE.2015.17
10.1109/COMST.2020.2970550
10.1109/INFOCOM.2019.8737464
10.1038/nature14539
10.1109/COMST.2020.2986024
10.1109/JSAC.2019.2904348
10.1109/TNNLS.2019.2953131
10.1109/MCOM.2011.6069707
10.1109/MCOM.2018.1701095
10.1109/JIOT.2016.2584538
10.1109/72.883477
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/TCCN.2021.3084406
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Electronics & Communications Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
DatabaseTitle CrossRef
Technology Research Database
Advanced Technologies Database with Aerospace
Electronics & Communications Abstracts
Discipline Engineering
EISSN 2332-7731
EndPage 1088
ExternalDocumentID 10_1109_TCCN_2021_3084406
9442814
Genre orig-research
GrantInformation_xml – fundername: Canada NSERC Discovery
  grantid: RGPIN-2019-06375
ISSN 2332-7731
IsPeerReviewed false
IsScholarly true
Issue 4
Language English
License https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/Crown.html
ORCID 0000-0002-1599-5480
PQID 2607876108
PQPubID 4437218
PageCount 11
PublicationCentury 2000
PublicationDate 2021-12-01
PublicationDateYYYYMMDD 2021-12-01
PublicationDate_xml – month: 12
  year: 2021
  text: 2021-12-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE transactions on cognitive communications and networking
PublicationTitleAbbrev TCCN
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References konečný (ref9) 2016
ref12
mcmahan (ref8) 2017
ref23
ref15
dinh (ref22) 2019
ref14
li (ref13) 2018
ref10
ref21
ref17
lecun (ref2) 2015; 521
ref16
lueth (ref1) 2019
ref19
ref18
ref7
chai (ref20) 2020
ref4
ref3
ref6
ref5
zhao (ref11) 2018
References_xml – year: 2016
  ident: ref9
  publication-title: Federated learning: Strategies for improving communication efficiency
– year: 2018
  ident: ref11
  publication-title: Federated learning with non-IID data
– year: 2019
  ident: ref22
  publication-title: Federated learning over wireless networks: Convergence analysis and resource allocation
– ident: ref12
  doi: 10.1109/INFOCOM41043.2020.9155494
– ident: ref16
  doi: 10.1109/ICC.2019.8761315
– ident: ref23
  doi: 10.1214/aoms/1177705673
– year: 2018
  ident: ref13
  publication-title: Federated optimization for heterogeneous networks
– ident: ref18
  doi: 10.1109/ICDCS.2019.00099
– ident: ref17
  doi: 10.1109/JSAC.2020.3036952
– ident: ref3
  doi: 10.1109/SOSE.2015.17
– ident: ref7
  doi: 10.1109/COMST.2020.2970550
– ident: ref15
  doi: 10.1109/INFOCOM.2019.8737464
– year: 2020
  ident: ref20
  publication-title: FedAT: A communication-efficient federated learning method with asynchronous tiers under non-IID data
– volume: 521
  start-page: 436
  year: 2015
  ident: ref2
  article-title: Deep learning
  publication-title: Nature
  doi: 10.1038/nature14539
– ident: ref10
  doi: 10.1109/COMST.2020.2986024
– ident: ref14
  doi: 10.1109/JSAC.2019.2904348
– ident: ref19
  doi: 10.1109/TNNLS.2019.2953131
– ident: ref4
  doi: 10.1109/MCOM.2011.6069707
– ident: ref6
  doi: 10.1109/MCOM.2018.1701095
– ident: ref5
  doi: 10.1109/JIOT.2016.2584538
– year: 2019
  ident: ref1
  publication-title: State of the IoT 2018: Number of IoT devices now at 7B, market accelerating
– start-page: 1273
  year: 2017
  ident: ref8
  article-title: Communication-efficient learning of deep networks from decentralized data
  publication-title: Proc Artif Intell Statist Conf (AISTATS)
– ident: ref21
  doi: 10.1109/72.883477
StartPage 1078
SubjectTerms Adaptation models
Algorithms
Collaborative work
Communication
communication efficiency
Convergence
Data models
Datasets
Distributed databases
Empirical analysis
Federated learning
Internet of Things
mobile edge computing
Nodes
Servers
Training
Weighting
Title Fast-Convergent Federated Learning With Adaptive Weighting
URI https://ieeexplore.ieee.org/document/9442814
https://www.proquest.com/docview/2607876108
Volume 7