Privacy-Preserving Federated Graph Neural Network Learning on Non-IID Graph Data
Published in | Wireless Communications and Mobile Computing, Vol. 2023, pp. 1–13 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Oxford: Hindawi Limited, 2023 |
Subjects | |
Summary | Since the concept of federated learning (FL) was proposed by Google in 2017, many applications have adopted FL for its strengths in data integration, computational efficiency, and privacy protection. However, most existing FL applications target image processing and natural language processing; few address graph neural networks, largely because graph data are not independent and identically distributed (non-IID). Representation learning on graph-structured data produces graph embeddings that help machines understand graphs effectively, and privacy protection is especially important when analyzing graph-structured data such as social networks. Hence, this paper proposes PPFL-GNN, a novel privacy-preserving federated graph neural network framework for node representation learning and a pioneering step toward graph neural network-based federated learning. In PPFL-GNN, each client generates graph embeddings from its local graph dataset and uses federated learning to integrate information from collaborating clients, producing more accurate representations. More importantly, by incorporating embedding alignment techniques, PPFL-GNN overcomes the obstacles federated learning faces on non-IID graph data and further reduces privacy exposure by sharing only selected information. |
ISSN | 1530-8669; 1530-8677 |
DOI | 10.1155/2023/8545101 |
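
The record contains only the abstract, so no implementation details of PPFL-GNN are given here. As a rough illustration of the general idea the abstract describes (clients learn node embeddings on their local graphs, align them to a shared embedding space, and exchange only selected embedding information), the following is a minimal, hypothetical Python sketch. It is not the authors' algorithm; the anchor-node assumption, the Procrustes-style alignment, and all names such as `local_gnn_embeddings` and `align_to_reference` are assumptions introduced purely for illustration.

```python
# Hypothetical sketch of federated node-embedding learning with embedding
# alignment.  Shapes, names, and the Procrustes-based alignment step are
# assumptions for illustration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 16       # embedding dimension (assumed)
N_ANCHORS = 32     # nodes assumed to be shared across all clients
N_CLIENTS = 3


def local_gnn_embeddings(n_private_nodes: int) -> np.ndarray:
    """Stand-in for a client's locally trained GNN encoder.

    Returns embeddings for the shared anchor nodes followed by the
    client's private nodes.  In practice these would come from a GNN
    trained on the client's own (non-IID) subgraph.
    """
    return rng.normal(size=(N_ANCHORS + n_private_nodes, EMB_DIM))


def align_to_reference(emb: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Orthogonal Procrustes alignment of a client's embedding space.

    Finds the rotation R minimising ||emb[:N_ANCHORS] @ R - reference||_F
    and applies it to all of the client's embeddings, so private nodes
    land in the shared space without ever being transmitted.
    """
    u, _, vt = np.linalg.svd(emb[:N_ANCHORS].T @ reference)
    rotation = u @ vt
    return emb @ rotation


# --- one simulated federation round -------------------------------------
clients = [local_gnn_embeddings(n_private_nodes=rng.integers(50, 100))
           for _ in range(N_CLIENTS)]

# Server-side reference space; here initialised from client 0's anchors.
reference = clients[0][:N_ANCHORS].copy()

aligned = [align_to_reference(c, reference) for c in clients]

# Only the aligned anchor embeddings are sent to the server;
# embeddings of private nodes stay local.
server_update = np.mean([a[:N_ANCHORS] for a in aligned], axis=0)

print("consensus anchor embeddings:", server_update.shape)
```

In this sketch, sharing only the aligned anchor embeddings, rather than raw graph data or full model parameters, mirrors the abstract's point that privacy exposure can be reduced by sharing only selected information across clients.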