On-Average KL-Privacy and Its Equivalence to Generalization for Max-Entropy Mechanisms

Bibliographic Details
Published in: Privacy in Statistical Databases, Vol. 9867, pp. 121-134
Main Authors: Wang, Yu-Xiang; Lei, Jing; Fienberg, Stephen E.
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2016
Series: Lecture Notes in Computer Science
ISBN: 9783319453804; 3319453807
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-45381-1_10

Summary: We define On-Average KL-Privacy and present its properties and connections to differential privacy, generalization, and information-theoretic quantities including max-information and mutual information. The new definition significantly weakens differential privacy while preserving its minimal design features, such as composition over small groups and multiple queries as well as closure under post-processing. Moreover, we show that On-Average KL-Privacy is equivalent to generalization for a large class of commonly used tools in statistics and machine learning that sample from Gibbs distributions, a class of distributions that arises naturally from the maximum entropy principle. In addition, a byproduct of our analysis yields a lower bound for generalization error in terms of mutual information, which reveals an interesting interplay with known upper bounds that use the same quantity.
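The mechanisms covered by the equivalence output a sample from a Gibbs distribution, p(theta | Z) proportional to exp(-gamma * sum_i loss(theta, z_i)), and On-Average KL-Privacy measures, roughly, the expected KL divergence between the mechanism's output distribution on a dataset and on a neighbor with one entry resampled i.i.d. The Python sketch below, which is not from the chapter, estimates that quantity by Monte Carlo for a toy Gibbs mechanism; the discrete parameter grid, the squared-error loss `ell`, the temperature `gamma`, and the Gaussian data model are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the authors' code): a Gibbs mechanism over
# a discrete parameter grid and a Monte Carlo estimate of the on-average KL
# divergence between its outputs on a dataset and on a one-point-resampled
# neighbor. All specifics (loss, grid, data distribution) are assumptions.
import numpy as np

rng = np.random.default_rng(0)
thetas = np.linspace(-2.0, 2.0, 81)  # discrete parameter grid (assumed)

def ell(theta, z):
    """Per-example loss; squared error chosen for illustration."""
    return (theta - z) ** 2

def gibbs(Z, gamma):
    """Gibbs distribution p(theta | Z) ∝ exp(-gamma * sum_i ell(theta, z_i))."""
    energy = np.array([sum(ell(t, z) for z in Z) for t in thetas])
    logp = -gamma * energy
    logp -= logp.max()            # stabilize before exponentiating
    p = np.exp(logp)
    return p / p.sum()

def kl(p, q):
    """KL divergence between two strictly positive discrete distributions."""
    return float(np.sum(p * (np.log(p) - np.log(q))))

def on_average_kl(n=20, gamma=0.1, trials=200):
    """Monte Carlo estimate of E_{Z, z'} KL(A(Z) || A(Z')), where Z' is Z
    with one entry replaced by a fresh i.i.d. draw."""
    total = 0.0
    for _ in range(trials):
        Z = rng.normal(size=n)            # dataset drawn i.i.d. (assumed model)
        Zp = Z.copy()
        Zp[rng.integers(n)] = rng.normal()  # resample one entry
        total += kl(gibbs(Z, gamma), gibbs(Zp, gamma))
    return total / trials

print(f"estimated on-average KL-privacy: {on_average_kl():.4f}")
```

Raising `gamma` (a lower-temperature, more data-sensitive Gibbs distribution) should drive the estimate up, mirroring the usual privacy-accuracy trade-off the summary alludes to.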