Privacy-Preserving Constrained Quadratic Optimization With Fisher Information
Published in: IEEE Signal Processing Letters, Vol. 27, pp. 545–549
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
Summary: Noisy (stochastic) gradient descent is used to develop privacy-preserving algorithms for solving constrained quadratic optimization problems. Via the Cramér-Rao bound, the variance of the error of an adversary's estimate of the parameters of the quadratic cost function, computed from the iterates of the algorithm, is lower-bounded by the inverse of the Fisher information of the noise. This motivates using the Fisher information as a measure of privacy. Because the performance degradation of noisy gradient descent is proportional to the variance of the added noise, the noise variance serves as the measure of utility degradation. The trade-off between privacy and utility is balanced by minimizing the Fisher information subject to a constraint on the variance of the noise. The optimal privacy-preserving noise is proved to be Gaussian, which implies that the developed privacy-preserving optimization algorithm also guarantees differential privacy. (A minimal sketch of the resulting noisy gradient iteration is given after the record below.)
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2020.2983320
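The summary describes a noisy projected gradient descent in which zero-mean Gaussian noise is added to each gradient evaluation, with the noise variance setting the balance between privacy (Fisher information, via the Cramér-Rao bound) and utility (performance degradation). The sketch below only illustrates that idea and is not the paper's implementation: the specific quadratic cost, box constraint, step size, iteration count, and function name are illustrative assumptions; the Gaussian gradient perturbation is the mechanism the summary describes.

```python
import numpy as np

def noisy_projected_gradient_descent(Q, q, lower, upper, sigma,
                                      step=0.1, iters=200, seed=0):
    """Minimize f(x) = 0.5 x^T Q x + q^T x over the box [lower, upper],
    adding zero-mean Gaussian noise with standard deviation sigma to each
    gradient evaluation. For a fixed noise variance, Gaussian noise
    minimizes the Fisher information an observer of the iterates can
    extract about (Q, q), which is the privacy argument in the summary."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    x = np.zeros(n)
    for _ in range(iters):
        grad = Q @ x + q                                      # exact gradient of the quadratic cost
        noisy_grad = grad + rng.normal(0.0, sigma, size=n)    # privacy-preserving perturbation
        x = np.clip(x - step * noisy_grad, lower, upper)      # projection onto the box constraint
    return x

# Hypothetical example: a small strongly convex quadratic with box constraints.
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
q = np.array([-1.0, 0.5])
x_private = noisy_projected_gradient_descent(Q, q, lower=-1.0, upper=1.0, sigma=0.5)
print(x_private)
```

Larger `sigma` makes the adversary's estimation problem harder (lower Fisher information, hence a larger Cramér-Rao lower bound on estimation error) but pushes the iterates further from the constrained minimizer, which is the privacy-utility trade-off the summary refers to.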