Confidential Federated Computations
Format | Journal Article
Language | English
Published | 16.04.2024
DOI | 10.48550/arxiv.2404.10764
Summary: Federated Learning and Analytics (FLA) have seen widespread adoption by technology platforms for processing sensitive on-device data. However, basic FLA systems have privacy limitations: they do not necessarily require anonymization mechanisms like differential privacy (DP), and provide limited protections against a potentially malicious service provider. Adding DP to a basic FLA system currently requires either adding excessive noise to each device's updates, or assuming an honest service provider that correctly implements the mechanism and only uses the privatized outputs. Secure multiparty computation (SMPC)-based oblivious aggregations can limit the service provider's access to individual user updates and improve DP tradeoffs, but the tradeoffs are still suboptimal, and they suffer from scalability challenges and susceptibility to Sybil attacks. This paper introduces a novel system architecture that leverages trusted execution environments (TEEs) and open-sourcing to both ensure confidentiality of server-side computations and provide externally verifiable privacy properties, bolstering the robustness and trustworthiness of private federated computations.
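The trade-off the abstract highlights, adding noise on every device versus trusting the service provider to add it once to the aggregate, is the standard local-DP vs. central-DP distinction. The following minimal NumPy sketch illustrates that distinction only; it is not the paper's architecture, and the function names, clipping norm, and noise scales are illustrative assumptions.

```python
import numpy as np

def clip(update, clip_norm=1.0):
    """Bound each device's contribution (L2 sensitivity) before adding noise."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / max(norm, 1e-12))

def local_dp_update(update, noise_std=1.0):
    """Local DP: each device perturbs its own update before upload,
    so no trust in the aggregator is needed, but noise accumulates."""
    clipped = clip(update)
    return clipped + np.random.normal(0.0, noise_std, size=clipped.shape)

def central_dp_aggregate(updates, noise_std=1.0):
    """Central DP: noise is added once to the clipped sum, which requires
    an honest service provider that applies the mechanism correctly."""
    total = np.sum([clip(u) for u in updates], axis=0)
    return total + np.random.normal(0.0, noise_std, size=total.shape)

# With n devices, the local-DP aggregate carries noise of standard deviation
# roughly sqrt(n) * noise_std, versus noise_std for the central-DP aggregate:
# the "excessive noise vs. trusted provider" dilemma the paper targets.
updates = [np.random.randn(8) for _ in range(100)]
local_sum = np.sum([local_dp_update(u) for u in updates], axis=0)
central_sum = central_dp_aggregate(updates)
```

The paper's proposal sits between these extremes: TEEs and open-sourced, externally verifiable server-side code aim to provide central-DP-like noise efficiency without assuming an honest service provider.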