Federated Learning Scheme Based on Secure Multi-party Computation and Differential Privacy

Bibliographic Details
Published in: Ji suan ji ke xue (Computer Science), Vol. 49, No. 9, pp. 297-305
Main Authors: Tang, Ling-tao; Wang, Di; Zhang, Lu-fei; Liu, Sheng-yun
Format: Journal Article
Language: Chinese
Published: Chongqing: Guojia Kexue Jishu Bu (Editorial Office of Computer Science), 01.09.2022

Summary: Federated learning provides a novel solution to collaborative learning among untrusted entities. Through a local-training-and-central-aggregation pattern, the federated learning algorithm trains a global model while protecting the local data privacy of each entity. However, recent studies show that local models uploaded by clients and global models produced by the server may still leak users' private information. Secure multi-party computation and differential privacy are two mainstream privacy-preserving techniques, which are used to protect the privacy of the computation process and of the computation outputs, respectively. Few existing works exploit the benefits of both techniques at the same time. This paper proposes a privacy-preserving federated learning scheme for deep learning that combines secure multi-party computation and differential privacy. Clients add noise to their local models and secret-share them among multiple servers. The servers aggregate these model shares by secure multi-party computation to obtain a pri…
ISSN: 1002-137X
DOI: 10.11896/jsjkx.210800108
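
The summary above describes a client-side step (add differential-privacy noise to the local model, then secret-share it to multiple servers) and a server-side step (aggregate the shares by secure multi-party computation). The following is a minimal illustrative sketch of that general pattern, using additive secret sharing over a prime field and Gaussian noise; the field modulus, fixed-point scale, clipping bound, noise level, and all function names are assumptions made here for illustration, not the paper's actual protocol or parameters.

import numpy as np

PRIME = 2**61 - 1   # assumed field modulus for additive secret sharing
SCALE = 2**16       # assumed fixed-point scale for encoding real-valued weights

def encode(x):
    """Encode real-valued model weights as field elements (fixed point)."""
    return np.mod(np.round(x * SCALE).astype(np.int64), PRIME)

def decode(x):
    """Decode field elements back to real values, mapping large values to negatives."""
    x = np.mod(x, PRIME)
    x = np.where(x > PRIME // 2, x - PRIME, x)
    return x.astype(np.float64) / SCALE

def client_share(local_model, clip_norm=1.0, noise_std=0.1, n_servers=3, rng=None):
    """Client side: clip the local model, add Gaussian noise (differential privacy),
    then split the noisy model into additive secret shares, one per server."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(local_model)
    clipped = local_model * min(1.0, clip_norm / (norm + 1e-12))
    noisy = clipped + rng.normal(0.0, noise_std, size=local_model.shape)
    encoded = encode(noisy)
    shares = [rng.integers(0, PRIME, size=encoded.shape, dtype=np.int64)
              for _ in range(n_servers - 1)]
    shares.append(np.mod(encoded - sum(shares), PRIME))
    return shares  # shares[i] is sent to server i

def server_aggregate(shares_from_clients):
    """Each server sums the shares it received from all clients (mod PRIME)."""
    total = np.zeros_like(shares_from_clients[0])
    for s in shares_from_clients:
        total = np.mod(total + s, PRIME)
    return total

def reconstruct(server_totals, n_clients):
    """Combine the servers' aggregated shares into the averaged noisy global model."""
    agg = np.zeros_like(server_totals[0])
    for t in server_totals:
        agg = np.mod(agg + t, PRIME)
    return decode(agg) / n_clients

In this sketch each server individually observes only uniformly random field elements, so the aggregation process reveals nothing about any single client's model, while the Gaussian noise limits what the reconstructed output can leak; this mirrors the process-versus-output division of labor between secure multi-party computation and differential privacy noted in the summary.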