On-Average KL-Privacy and Its Equivalence to Generalization for Max-Entropy Mechanisms
Published in | Privacy in Statistical Databases, Vol. 9867, pp. 121-134 |
---|---|
Main Authors | Yu-Xiang Wang, Jing Lei, Stephen E. Fienberg |
Format | Book Chapter |
Language | English |
Published | Switzerland: Springer International Publishing, 2016 |
Series | Lecture Notes in Computer Science |
ISBN | 9783319453804; 3319453807 |
ISSN | 0302-9743; 1611-3349 |
DOI | 10.1007/978-3-319-45381-1_10 |
Summary: We define On-Average KL-Privacy and present its properties and connections to differential privacy, generalization, and information-theoretic quantities including max-information and mutual information. The new definition significantly weakens differential privacy while preserving its minimal design features, such as composition over small groups and multiple queries, as well as closure under post-processing. Moreover, we show that On-Average KL-Privacy is equivalent to generalization for a large class of commonly used tools in statistics and machine learning that sample from Gibbs distributions, a class of distributions that arises naturally from the maximum entropy principle. In addition, a byproduct of our analysis yields a lower bound for generalization error in terms of mutual information, which reveals an interesting interplay with known upper bounds that use the same quantity.
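To make the summary's two central objects concrete, the following is a hedged sketch reconstructed from standard usage of these terms, not copied from the chapter; the notation ($A$, $Z$, $\mathcal{D}$, $L$, $\gamma$) is assumed here, and the chapter's exact definitions may differ in details such as how the replaced record is chosen.

```latex
% Sketch under assumed notation: A is a randomized algorithm (mechanism),
% Z = (z_1, ..., z_n) a dataset of i.i.d. draws from a distribution D,
% and Z' is Z with one record replaced by a fresh draw z' ~ D.

% Gibbs (max-entropy) mechanism: the output theta is sampled with density
% proportional to an exponentiated loss, the form that maximizes entropy
% subject to a constraint on the expected loss L(theta, Z):
\[
  p_{A(Z)}(\theta) \;\propto\; \exp\bigl(-\gamma \, L(\theta, Z)\bigr).
\]

% On-Average KL-Privacy: the KL divergence between the output distributions
% on neighboring datasets is bounded on average over the data and the fresh
% replacement, rather than for every neighboring pair:
\[
  \mathbb{E}_{Z \sim \mathcal{D}^n,\; z' \sim \mathcal{D}}
  \left[ D_{\mathrm{KL}}\!\left( A(Z) \,\middle\|\, A(Z') \right) \right]
  \;\le\; \varepsilon.
\]
```

Averaging over the data distribution is what makes the notion strictly weaker than differential privacy, which demands the corresponding bound in the worst case over all pairs of neighboring datasets.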