Interpreting Epsilon of Differential Privacy in Terms of Advantage in Guessing or Approximating Sensitive Attributes

Bibliographic Details
Published in: 2022 IEEE 35th Computer Security Foundations Symposium (CSF), pp. 96-111
Main Authors: Pankova, Alisa; Laud, Peeter
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2022

Summary: Differential privacy is a privacy technique with provable guarantees, typically achieved by adding noise to statistics before releasing them. The level of privacy is characterized by a numeric parameter ε > 0, where a smaller ε means more privacy. However, there is no common agreement on how small ε should be, and the actual likelihood of data leakage for the same ε may vary across different released statistics and different datasets. In this paper, we show how to relate ε to the increase in the probability of the attacker's success in guessing something about the private data. The attacker's goal is stated as a Boolean expression over guesses of particular categorical and numerical attributes, where numeric attributes may be guessed with some precision. The paper is built upon the definition of d-privacy, which is a generalization of ε-differential privacy.
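As a rough illustration of the kind of guarantee the abstract refers to (and not the paper's own construction, which handles Boolean combinations of categorical and numerical attributes under d-privacy), the sketch below evaluates the standard Bayesian bound on how much a single ε-DP release can raise an attacker's probability of correctly guessing one Boolean attribute: posterior ≤ e^ε·p / (e^ε·p + 1 − p) for prior p. The function names are illustrative, not taken from the paper.

    import math

    def posterior_bound(eps: float, prior: float) -> float:
        # Worst-case posterior probability of correctly guessing a Boolean
        # attribute after seeing one eps-DP output, given prior belief `prior`
        # (follows from the DP definition via Bayes' rule).
        return math.exp(eps) * prior / (math.exp(eps) * prior + (1.0 - prior))

    def advantage(eps: float, prior: float) -> float:
        # Upper bound on the increase in the attacker's guessing probability.
        return posterior_bound(eps, prior) - prior

    if __name__ == "__main__":
        for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
            # With a uniform prior of 0.5 the attacker starts from a coin flip.
            print(f"eps = {eps:3.1f}:  advantage <= {advantage(eps, 0.5):.3f}")

For ε = 1 and a uniform prior this caps the attacker's success probability at about 0.73, an advantage of roughly 0.23 over random guessing; the paper's contribution is to derive such interpretable numbers for richer attacker goals, including numeric attributes guessed up to a given precision.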
DOI:10.1109/CSF54842.2022.9919656