Privacy against statistical inference
Published in: 2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton), pp. 1401-1408
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2012
Summary: We propose a general statistical inference framework to capture the privacy threat incurred by a user who releases data to a passive but curious adversary, given utility constraints. We show that applying this general framework to the setting where the adversary uses the self-information cost function naturally leads to a non-asymptotic information-theoretic approach for characterizing the best achievable privacy subject to utility constraints. Based on these results, we introduce two privacy metrics, namely average information leakage and maximum information leakage. We prove that under both metrics the resulting design problem of finding the optimal mapping from the user's data to a privacy-preserving output can be cast as a modified rate-distortion problem which, in turn, can be formulated as a convex program. Finally, we compare our framework with differential privacy. An illustrative sketch of this convex program follows the record below.
ISBN: 9781467345378; 1467345377
DOI: 10.1109/Allerton.2012.6483382
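The average-information-leakage formulation in the abstract lends itself to a direct numerical illustration. The sketch below (not the authors' code) sets up the kind of convex program the summary describes: choosing a randomized mapping p(y|x) from the user's data X to a released output Y that minimizes the leakage I(S;Y) about a private attribute S, subject to an expected-distortion utility constraint. The toy joint distribution p(s,x), the Hamming distortion, the budget D, and the use of cvxpy are all illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the privacy-utility convex program described in the abstract.
# All numbers below are assumed toy values for illustration only.
import numpy as np
import cvxpy as cp

# Assumed toy joint distribution of the private attribute S and the data X.
p_sx = np.array([[0.30, 0.10],
                 [0.05, 0.25],
                 [0.10, 0.20]])           # shape (nS, nX), entries sum to 1
nS, nX = p_sx.shape
nY = nX                                   # release alphabet = data alphabet (assumed)
p_s = p_sx.sum(axis=1)                    # marginal of S
p_x = p_sx.sum(axis=0)                    # marginal of X

d = 1.0 - np.eye(nX, nY)                  # Hamming distortion d(x, y) (assumed)
D = 0.2                                   # utility budget: E[d(X, Y)] <= D (assumed)

Q = cp.Variable((nX, nY), nonneg=True)    # privacy mapping p(y|x), one row per x

p_sy = p_sx @ Q                           # joint p(s, y), affine in Q
p_y = p_x @ Q                             # marginal p(y), affine in Q

# Average information leakage I(S;Y) = sum_{s,y} p(s,y) log( p(s,y) / (p(s) p(y)) ),
# written with the jointly convex relative-entropy atom (natural log, so nats).
denom = p_s[:, None] @ cp.reshape(p_y, (1, nY))
leakage = cp.sum(cp.rel_entr(p_sy, denom))

constraints = [
    cp.sum(Q, axis=1) == 1,                          # each row of Q is a distribution
    cp.sum(cp.multiply(p_x[:, None] * d, Q)) <= D,   # expected distortion <= D
]
prob = cp.Problem(cp.Minimize(leakage), constraints)
prob.solve()

print("min average leakage (nats):", prob.value)
print("optimal mapping p(y|x):\n", Q.value)
```

Because I(S;Y) is the relative entropy between the joint p(s,y) and the product of its marginals, and both arguments are affine in the mapping Q, the objective is convex in Q; this is what makes the rate-distortion-style design problem above a tractable convex program, consistent with the claim in the summary.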