Differentially Private Normalizing Flows for Privacy-Preserving Density Estimation

Bibliographic Details
Published in: arXiv.org
Main Authors: Waites, Chris; Cummings, Rachel
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 25.03.2021

Summary: Normalizing flow models have risen as a popular solution to the problem of density estimation, enabling high-quality synthetic data generation as well as exact probability density evaluation. However, in contexts where individuals are directly associated with the training data, releasing such a model raises privacy concerns. In this work, we propose the use of normalizing flow models that provide explicit differential privacy guarantees as a novel approach to the problem of privacy-preserving density estimation. We evaluate the efficacy of our approach empirically using benchmark datasets, and we demonstrate that our method substantially outperforms previous state-of-the-art approaches. We additionally show how our algorithm can be applied to the task of differentially private anomaly detection.
ISSN: 2331-8422
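
This record carries no implementation details, but the standard route to a flow model with "explicit differential privacy guarantees" is to optimize the flow's maximum-likelihood objective with DP-SGD (per-example gradient clipping plus Gaussian noise). The sketch below illustrates that recipe on a RealNVP-style affine coupling flow in PyTorch; it is a minimal sketch under that assumption, not the authors' released code. The architecture and the hyperparameters `clip_norm` and `noise_mult` are illustrative, and a privacy accountant (e.g., the moments accountant) would still be needed to convert the noise level into a concrete (epsilon, delta) guarantee.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: y2 = x2 * exp(s(x1)) + t(x1), log|det J| = sum(s).

    Assumes an even input dimension so the two halves match.
    """
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * half))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        if self.flip:                      # alternate which half is transformed
            x1, x2 = x2, x1
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                  # bound the scales for stability
        y2 = x2 * torch.exp(s) + t
        y = torch.cat((y2, x1) if self.flip else (x1, y2), dim=-1)
        return y, s.sum(dim=-1)            # per-example log-determinant

class Flow(nn.Module):
    """Stack of coupling layers over a standard-normal base distribution."""
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            AffineCoupling(dim, flip=bool(i % 2)) for i in range(n_layers))
        self.base = torch.distributions.Normal(0.0, 1.0)

    def log_prob(self, x):
        # Change of variables: log p(x) = log p_z(f(x)) + log|det Jf(x)|.
        log_det = torch.zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return self.base.log_prob(x).sum(dim=-1) + log_det

def dp_sgd_step(model, batch, optimizer, clip_norm=1.0, noise_mult=1.1):
    """One DP-SGD step: clip each example's gradient to clip_norm, sum,
    add Gaussian noise with std noise_mult * clip_norm, then average."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x in batch:                        # per-example gradients (slow but clear)
        loss = -model.log_prob(x.unsqueeze(0)).mean()
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (norm + 1e-6)).clamp(max=1.0)
        for acc, g in zip(summed, grads):
            acc += g * scale
    for p, acc in zip(params, summed):
        noise = torch.randn_like(p) * noise_mult * clip_norm
        p.grad = (acc + noise) / len(batch)
    optimizer.step()
    optimizer.zero_grad()

# Toy usage with stand-in data; a real run would use the paper's benchmarks.
torch.manual_seed(0)
model = Flow(dim=2)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
data = torch.randn(256, 2)
for start in range(0, len(data), 64):
    dp_sgd_step(model, data[start:start + 64], opt)

# Anomaly detection with the private model: low log-density = anomalous.
with torch.no_grad():
    scores = -model.log_prob(data)
```

Because the flow gives exact log-densities, the same trained model serves both tasks the summary mentions: sampling for synthetic data and thresholding `-log_prob(x)` as an anomaly score.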