ASFGNN: Automated separated-federated graph neural network


Bibliographic Details
Published in: Peer-to-Peer Networking and Applications, Vol. 14, No. 3, pp. 1692–1704
Main Authors: Zheng, Longfei; Zhou, Jun; Chen, Chaochao; Wu, Bingzhe; Wang, Li; Zhang, Benyu
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.05.2021

Abstract: Graph Neural Networks (GNNs) have achieved remarkable performance by taking advantage of graph data. The success of GNN models always depends on rich features and adjacent relationships. However, in practice, such data are usually isolated by different data owners (clients) and thus are likely to be Non-Independent and Identically Distributed (Non-IID). Meanwhile, considering the limited network status of data owners, hyper-parameter optimization for collaborative learning approaches is time-consuming in data isolation scenarios. To address these problems, we propose an Automated Separated-Federated Graph Neural Network (ASFGNN) learning paradigm. ASFGNN consists of two main components, i.e., the training of GNN and the tuning of hyper-parameters. Specifically, to solve the data Non-IID problem, we first propose a separated-federated GNN learning model, which decouples the training of GNN into two parts: the message passing part that is done by clients separately, and the loss computing part that is learnt by clients federally. To handle the time-consuming parameter tuning problem, we leverage the Bayesian optimization technique to automatically tune the hyper-parameters of all the clients. We conduct experiments on benchmark datasets and the results demonstrate that ASFGNN significantly outperforms the naive federated GNN, in terms of both accuracy and parameter-tuning efficiency.
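The separated-federated split described in the abstract can be sketched in a few lines of plain Python. This is an illustrative toy, not the paper's implementation: the graphs, the mean-aggregation propagation rule, and the stand-in prediction-layer weights are all invented for the example. Message passing runs on each client's private graph; only the shared layer's parameters are averaged, FedAvg-style.

```python
# Toy sketch of the separated-federated split described in the abstract.
# Illustrative only: graphs, propagation rule, and weight values are invented.

def local_message_passing(features, edges, rounds=2):
    """One client's GNN part: propagate by mean-aggregating neighbor features.

    This stays entirely on the client; the raw graph never leaves.
    """
    h = {node: list(vec) for node, vec in features.items()}
    for _ in range(rounds):
        new_h = {}
        for node, vec in h.items():
            nbrs = [h[dst] for (src, dst) in edges if src == node]
            if not nbrs:
                new_h[node] = vec
                continue
            agg = [sum(col) / len(nbrs) for col in zip(*nbrs)]
            new_h[node] = [(v + a) / 2 for v, a in zip(vec, agg)]
        h = new_h
    return h

def federated_average(client_weights):
    """Server step: FedAvg over the shared loss-computing layer's weights."""
    k = len(client_weights)
    return [sum(ws) / k for ws in zip(*client_weights)]

# Two clients holding isolated (and possibly Non-IID) graphs.
client_graphs = [
    ({0: [1.0, 0.0], 1: [0.0, 1.0]}, [(0, 1), (1, 0)]),
    ({0: [0.5, 0.5], 1: [1.0, 1.0]}, [(0, 1), (1, 0)]),
]

# Separated part: message passing is done by each client on its own data.
embeddings = [local_message_passing(f, e) for f, e in client_graphs]

# Federated part: only the small shared-layer weight vectors are exchanged.
local_weights = [[0.2, 0.4], [0.6, 0.8]]  # stand-ins for locally trained layers
global_weights = federated_average(local_weights)
print(global_weights)  # averaged shared-layer weights, roughly [0.4, 0.6]
```

Under this decomposition, node features and edges never cross the network; only the weights of the loss-computing part are shared, which is what lets each client keep a GNN adapted to its own (Non-IID) graph.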
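For the hyper-parameter side, the abstract's Bayesian optimization can be sketched as sequential model-based search: fit a Gaussian-process surrogate to the (hyper-parameter, metric) pairs observed so far, then evaluate the candidate that maximizes an acquisition function. Everything below is a self-contained toy assuming a single learning-rate hyper-parameter; the RBF kernel, length-scale, UCB acquisition, and the hypothetical `objective` (a stand-in for one federated training run) are illustrative choices, not the paper's configuration.

```python
import math

def rbf(a, b, ls=0.2):
    """RBF kernel: correlation between two hyper-parameter values."""
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_posterior(xs, ys, x, noise=1e-6):
    """Posterior mean/variance of a zero-mean GP at x, given data (xs, ys)."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    k = [rbf(a, x) for a in xs]
    mean = sum(ki * ai for ki, ai in zip(k, solve(K, ys)))
    var = rbf(x, x) - sum(ki * vi for ki, vi in zip(k, solve(K, k)))
    return mean, max(var, 1e-12)

def objective(lr):
    # Hypothetical metric peaking at lr = 0.1; in ASFGNN's setting each call
    # would be a full round of separated-federated training and evaluation.
    return -(lr - 0.1) ** 2

candidates = [i / 100 for i in range(1, 51)]  # learning-rate grid 0.01 .. 0.50
xs = [0.01, 0.45]                             # initial evaluations
ys = [objective(x) for x in xs]
for _ in range(8):
    def ucb(x):  # upper confidence bound: exploit mean, explore variance
        mean, var = gp_posterior(xs, ys, x)
        return mean + 2.0 * math.sqrt(var)
    x_next = max(candidates, key=ucb)
    xs.append(x_next)
    ys.append(objective(x_next))
best = xs[max(range(len(ys)), key=lambda i: ys[i])]
print(best)
```

Because each real evaluation costs a collaborative training round over slow client networks, a surrogate-guided search like this needs far fewer trials than grid search, which is the parameter-tuning efficiency the abstract claims.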
Authors
1. Zheng, Longfei — Ant Group
2. Zhou, Jun — Ant Group
3. Chen, Chaochao — Ant Group; ORCID: 0000-0003-1419-964X; email: chaochao.ccc@antgroup.com
4. Wu, Bingzhe — Ant Group
5. Wang, Li — Ant Group
6. Zhang, Benyu — Ant Group
Copyright The Author(s), under exclusive licence to Springer Science+Business Media, LLC part of Springer Nature 2021
DOI 10.1007/s12083-021-01074-w
Discipline Engineering
EISSN 1936-6450
EndPage 1704
ISSN 1936-6442
IsPeerReviewed true
IsScholarly true
Issue 3
Keywords Federated learning
Privacy preserving
Bayesian optimization
Graph neural network
ORCID 0000-0003-1419-964X
PageCount 13
PublicationDate 2021-05-01
PublicationPlace New York
PublicationTitle Peer-to-peer networking and applications
PublicationTitleAbbrev Peer-to-Peer Netw. Appl
PublicationYear 2021
Publisher Springer US
Springer Nature B.V
References McMahan HB, Moore E, Ramage D, y Arcas BA (2016) Federated learning of deep networks using model averaging. arXiv:1602.05629
Finn C, Abbeel P, Levine S (2017) Model-agnostic meta-learning for fast adaptation of deep networks
Zhou J, Chen C, Zheng L, Zheng X, Wu B, Liu Z, Wang L (2020) Privacy-preserving graph neural network for node classification. arXiv:2005.11903
Wu J, Chen XY, Zhang H, Xiong LD, Lei H, Deng SH (2019) Hyperparameter optimization for machine learning models based on Bayesian optimization. J Electron Sci Technol 17(1):26–40. https://doi.org/10.11989/JEST.1674-862X.80904120, http://www.sciencedirect.com/science/article/pii/S1674862X19300047
Chen YW, Song Q, Hu X (2019) Techniques for automated machine learning
Lorenzo PR, Nalepa J, Ramos LS, Pastor JR (2017) Hyper-parameter selection in deep neural networks using parallel particle swarm optimization. In: Proceedings of the Genetic and Evolutionary Computation Conference Companion, GECCO ’17. Association for Computing Machinery, New York, pp 1864–1871. https://doi.org/10.1145/3067695.3084211
Kipf TN, Welling M (2016) Semi-supervised classification with graph convolutional networks. arXiv:1609.02907
Wang T, Zhu JY, Torralba A, Efros AA (2018) Dataset distillation
Shamir A (1979) How to share a secret. Commun ACM 22(11):612–613
Scarselli F, Gori M, Tsoi AC, Hagenbuchner M, Monfardini G (2009) The graph neural network model. IEEE Trans Neural Netw 20(1):61–80
Abril PS, Plant R (2007) A comprehensive survey on graph neural networks. Commun ACM 50(1), 36–44. https://doi.org/10.1145/1188913.1188915
Ying R, He R, Chen K, Eksombatchai P, Hamilton WL, Leskovec J (2018) Graph convolutional neural networks for web-scale recommender systems. In: SIGKDD. ACM, pp 974–983
Aono Y, Hayashi T, Trieu Phong L, Wang L (2016) Scalable and secure logistic regression via homomorphic encryption. In: CODASPY. ACM, pp 142–144
Zhao Y, Li M, Lai L, Suda N, Civin D, Chandra V (2018) Federated learning with non-iid data
Lindauer M, Eggensperger K, Feurer M, Falkner S, Biedenkapp A, Hutter F (2017) Smac v3: Algorithm configuration in python. https://github.com/automl/SMAC3
Gu Z, Huang H, Zhang J, Su D, Lamba A, Pendarakis D, Molloy I (2019) Securing input data of deep learning inference systems via partitioned enclave execution. CoRR arXiv:1807.00969
Mei G, Guo Z, Liu S, Pan L (2019) Sgnn: A graph neural network based federated learning approach by hiding structure. In: 2019 IEEE International Conference on Big Data (Big Data). IEEE, pp 2560–2568
Bojchevski A, Günnemann S (2017) Deep gaussian embedding of attributed graphs: Unsupervised inductive learning via ranking. arXiv:1707.03815
Liu Z, Chen C, Li L, Zhou J, Li X, Song L, Qi Y (2018) Geniepath: Graph neural networks with adaptive receptive paths
Yu T, Zhu H (2020) Hyper-parameter optimization: A review of algorithms and applications. arXiv:2003.05689
Chen C, Zhou J, Wang L, Wu X, Fang W, Tan J, Wang L, Ji X, Liu A, Wang H (2020) When homomorphic encryption marries secret sharing: Secure large-scale sparse logistic regression and applications in risk control. arXiv:2008.08753
Abuadbba S, Kim K, Kim M, Thapa C, Camtepe SA, Gao Y, Kim H, Nepal S (2020) Can we use split learning on 1d cnn models for privacy preserving training? arXiv:2003.12365
Lin J, Wong SKM (1990) A new directed divergence measure and its characterization. Int J Gen Syst 17(1):73–81
Wang Y, Sun Y, Liu Z, Sarma SE, Bronstein MM, Solomon JM (2018) Dynamic graph cnn for learning on point clouds. arXiv:1801.07829
McMahan HB, Moore E, Ramage D, Hampson S, y Arcas BA (2017) Communication-efficient learning of deep networks from decentralized data. In: AISTATS
Thakkar O, Andrew G, McMahan HB (2019) Differentially private learning with adaptive clipping
Hamilton W, Ying Z, Leskovec J (2017) Inductive representation learning on large graphs. In: NeurIPS, pp 1024–1034
Perozzi B, Al-Rfou R, Skiena S (2014) Deepwalk: Online learning of social representations. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’14. Association for Computing Machinery, New York, pp 701–710. https://doi.org/10.1145/2623330.2623732
Kipf TN, Welling M (2016) Semi-supervised classification with graph convolutional networks. CoRR arXiv:1609.02907
Swersky K, Snoek J, Adams RP (2013) Multi-task bayesian optimization. In: Proceedings of the 26th International Conference on Neural Information Processing Systems - Volume 2, NIPS’13. Curran Associates Inc, Red Hook, pp 2004–2012
Kairouz P, McMahan H.B, Avent B, Bellet A, Bennis M, Bhagoji A.N, Bonawitz K, Charles Z, Cormode G, Cummings R, D’Oliveira RGL, Rouayheb SE, Evans D, Gardner J, Garrett Z, Gascón A, Ghazi B, Gibbons PB, Gruteser M, Harchaoui Z, He C, He L, Huo Z, Hutchinson B, Hsu J, Jaggi M, Javidi T, Joshi G, Khodak M, Konečný J, Korolova A, Koushanfar F, Koyejo S, Lepoint T, Liu Y, Mittal P, Mohri M, Nock R, Özgür A, Pagh R, Raykova M, Qi H, Ramage D, Raskar R, Song D, Song W, Stich SU, Sun Z, Suresh AT, Tramèr F, Vepakomma P, Wang J, Xiong L, Xu Z, Yang Q, Yu FX, Yu H, Zhao S (2019) Advances and open problems in federated learning
Liu Z, Chen C, Yang X, Zhou J, Li X, Song L (2018) Heterogeneous graph neural networks for malicious account detection. In: Proceedings of the 27th ACM International Conference on Information and Knowledge Management, CIKM ’18. Association for Computing Machinery, New York, pp 2077–2085. https://doi.org/10.1145/3269206.3272010
Gupta O, Raskar R (2018) Distributed learning of deep neural network over multiple agents. J Netw Comput Appl 116:1–8
Abadi M, Agarwal A, Barham P, Brevdo E, Chen Z, Citro C, Corrado GS, Davis A, Dean J, Devin M, Ghemawat S, Goodfellow I, Harp A, Irving G, Isard M, Jia Y, Jozefowicz R, Kaiser L, Kudlur M, Levenberg J, Mané D, Monga R, Moore S, Murray D, Olah C, Schuster M, Shlens J, Steiner B, Sutskever I, Talwar K, Tucker P, Vanhoucke V, Vasudevan V, Viégas F, Vinyals O, Warden P, Wattenberg M, Wicke M, Yu Y, Zheng X (2015) TensorFlow: Large-scale machine learning on heterogeneous systems. https://www.tensorflow.org/. Software available from tensorflow.org
Gao Y, Kim M, Abuadbba S, Kim Y, Thapa C, Kim K, Camtepe SA, Kim H, Nepal S (2020) End-to-end evaluation of federated learning and split learning for internet of things. arXiv:2003.13376
Veličković P, Cucurull G, Casanova A, Romero A, Liò P, Bengio Y (2017) Graph attention networks
StartPage 1692
SubjectTerms Automation
Clients
Communications Engineering
Computer Communication Networks
Engineering
Graph neural networks
Information Systems and Communication Service
Learning
Mathematical models
Message passing
Networks
Neural networks
Optimization
Optimization techniques
Parameters
Signal,Image and Speech Processing
Special Issue on Privacy-Preserving Computing
Training
Tuning
Title ASFGNN: Automated separated-federated graph neural network
URI https://link.springer.com/article/10.1007/s12083-021-01074-w
https://www.proquest.com/docview/2512386706
Volume 14