A Continuous Verification Mechanism for Clients in Federated Unlearning to Defend the Right to be Forgotten
Published in: Proceedings of the ... International Symposium on Parallel and Distributed Processing with Applications (Print), pp. 864-871
Format: Conference Proceeding
Language: English
Published: IEEE, 30.10.2024
Summary: In Federated Learning (FL), the regulatory need for the "right to be forgotten" calls for efficient Federated Unlearning (FU) methods, which enable FL models to unlearn designated training data. With the emergence of FU, verifying its performance plays a critical role in evaluating the consistency of FU methods and guarding against unexpected degradation of the FL model. Although well developed, none of the existing FU verification methods serves clients who opt out of the FL process, which is a universal demand in FL. More specifically, after clients quit the FL cooperation, they can no longer verify whether the FL model has unlearned their data once FL training continues for several rounds. To this end, we introduce a continuous verification mechanism for FL clients, called Backdoor Attack-based Forgetting Verification (BAFV). Inspired by backdoor attacks, BAFV embeds a persistent mark for the client that proposes to leave, so that the client retains the right to verify FU even after leaving the FL cooperation for a relatively long period. Extensive experiments across diverse FU environments and datasets demonstrate that our method maintains model accuracy and provides clients with a continuous verification mechanism to defend their rights. Our code for BAFV is publicly available at: https://github.com/paper-liu/BAFV-master.git.
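The mechanism summarized above can be sketched in a few lines: a departing client stamps a trigger pattern (the persistent mark) onto some of its samples before leaving, and later probes the model with triggered inputs; if the model still maps them to the poisoned target label at a high rate, its data has not been unlearned. The corner-patch trigger, fixed target label, and success-rate threshold below are illustrative assumptions for the sketch, not BAFV's actual design:

```python
import numpy as np

def embed_trigger(images, trigger_value=1.0, patch_size=3):
    """Stamp a small bottom-right patch (the backdoor mark) onto each image.
    Hypothetical trigger shape; the paper's mark may differ."""
    marked = images.copy()
    marked[:, -patch_size:, -patch_size:] = trigger_value
    return marked

def poison_labels(labels, target_label=0):
    # Triggered training samples are all relabeled to one fixed target class,
    # so the mark ties the model's behavior to the departing client's data.
    return np.full_like(labels, target_label)

def verify_unlearning(predict_fn, probe_images, target_label=0, threshold=0.5):
    """Probe the global model after leaving the federation.
    Returns (attack_success_rate, unlearned). A high success rate on
    triggered probes means the mark -- and hence the client's data --
    still influences the model; a low rate indicates unlearning succeeded."""
    triggered = embed_trigger(probe_images)
    preds = predict_fn(triggered)
    asr = float(np.mean(preds == target_label))
    return asr, asr < threshold
```

Because the probe only needs black-box predictions on triggered inputs, the client can repeat this check in any later round, which is what makes the verification continuous.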
ISSN: 2158-9208
DOI: 10.1109/ISPA63168.2024.00115