Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory
Published in: arXiv.org
Format: Paper / Journal Article (Working Paper / Pre-Print)
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 23.01.2019
ISSN: 2331-8422
DOI: 10.48550/arxiv.1810.11153
Summary: A deterministic privacy metric using non-stochastic information theory is developed. In particular, minimax information is used to construct a measure of information leakage, which is inversely proportional to the measure of privacy. Anyone can submit a query to a trusted agent with access to a non-stochastic uncertain private dataset. Optimal deterministic privacy-preserving policies for responding to the submitted query are computed by maximizing the measure of privacy subject to a constraint on the worst-case quality of the response (i.e., the worst-case difference between the response by the agent and the output of the query computed on the private dataset). The optimal privacy-preserving policy is proved to be a piecewise constant function in the form of a quantization operator applied to the output of the submitted query. The measure of privacy is also used to analyze the performance of the \(k\)-anonymity methodology (a popular deterministic mechanism for privacy-preserving release of datasets using suppression and generalization techniques), proving that it is in fact not privacy-preserving.
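The optimal policy described in the summary is a quantization of the query output subject to a worst-case quality constraint. As a rough illustration only (not the paper's construction; the uniform bin width, the scalar-valued query, and the name `quantize_response` are assumptions made here), a uniform quantizer that returns the midpoint of a bin of width 2·epsilon keeps the response within epsilon of the true query output while mapping many distinct dataset values to the same response:

```python
import math

def quantize_response(query_output: float, epsilon: float) -> float:
    """Respond with the midpoint of a uniform bin of width 2*epsilon.

    The response differs from the true query output by at most epsilon
    (the worst-case quality constraint), while collapsing many possible
    outputs onto one response limits what the response reveals about the
    underlying private dataset.
    """
    width = 2.0 * epsilon
    bin_index = math.floor(query_output / width)
    return (bin_index + 0.5) * width

# Hypothetical example: an average-salary query with a worst-case error budget of 500.
true_average = 51234.0
print(quantize_response(true_average, 500.0))  # -> 51500.0 (error 266 <= 500)
```

In this sketch, coarser bins (larger epsilon) reveal less about the private dataset at the cost of a looser worst-case quality guarantee, mirroring the privacy-versus-quality trade-off stated in the summary.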