Protecting Sensitive Attributes by Adversarial Training through Class-Overlapping Techniques


Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, Vol. 18, p. 1
Main Authors: Lin, Tsung-Hsien; Lee, Ying-Shuo; Chang, Fu-Chieh; Morris Chang, J.; Wu, Pei-Yuan
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023

Summary: In recent years, machine learning as a service (MLaaS) has brought considerable convenience to our daily lives. However, these services raise the issue of leaking users' sensitive attributes, such as race, when provided through the cloud. The present work overcomes this issue by proposing an innovative privacy-preserving approach called privacy-preserving class overlap (PPCO), which incorporates both a Wasserstein generative adversarial network and the idea of class overlapping to obfuscate data for better resilience against attribute-inference attacks (i.e., malicious inference of users' sensitive attributes). Experiments show that the proposed method can be employed to enhance current state-of-the-art works and achieve a superior privacy-utility trade-off. Furthermore, the proposed method is shown to be less susceptible to the influence of imbalanced classes in training data. Finally, we provide a theoretical analysis of the performance of our proposed method to give a flavour of the gap between theoretical and empirical performances.
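The summary does not spell out the training objective, but the Wasserstein adversarial component it mentions can be sketched in a toy form: a linear obfuscator is trained against a box-clipped linear critic that scores how separable the two sensitive-attribute groups are. Everything here (the linear models, the closed-form clipped critic, the synthetic data) is an illustrative assumption, not the paper's actual PPCO method, and the class-overlap component is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two groups whose means differ along the first feature.
# The group label plays the role of a sensitive attribute (e.g., race).
n, d = 200, 2
x0 = rng.normal([0.0, 0.0], 1.0, size=(n, d))  # attribute a = 0
x1 = rng.normal([3.0, 0.0], 1.0, size=(n, d))  # attribute a = 1

clip = 1.0  # box constraint on the critic, mimicking WGAN weight clipping

def critic_gap(W):
    """Score gap of the best box-clipped linear critic f(z) = z @ c.

    Maximising (mean(z1) - mean(z0)) @ c over |c_i| <= clip has the
    closed-form solution c = clip * sign(mean difference), standing in
    for the weight-clipped critic of a WGAN.
    """
    diff = (x1 @ W).mean(axis=0) - (x0 @ W).mean(axis=0)
    c = clip * np.sign(diff)
    return float(diff @ c), c

W = np.eye(d)  # linear obfuscator; the released representation is z = x @ W
lr = 0.05
gap_before, _ = critic_gap(W)

# Adversarial loop: the obfuscator descends the critic's score gap, so the
# two attribute groups become harder to tell apart after obfuscation.
dmu = x1.mean(axis=0) - x0.mean(axis=0)  # fixed input-space mean difference
for _ in range(100):
    _, c = critic_gap(W)
    W -= lr * np.outer(dmu, c)  # gradient of dmu @ W @ c with respect to W

gap_after, _ = critic_gap(W)
print(gap_before, gap_after)  # gap_after is far smaller than gap_before
```

Replacing the linear pieces with neural networks and a gradient-trained critic recovers the usual WGAN training loop; the closed-form critic here just keeps the sketch short and deterministic.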
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2023.3236180