Privacy-Preserving Gradient-Descent Methods

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, Vol. 22, No. 6, pp. 884-899
Main Authors: Shuguo Han, Wee Keong Ng, Li Wan, Vincent C. S. Lee
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.06.2010

Summary: Gradient descent is a widely used paradigm for solving many optimization problems. It aims to minimize a target function in order to reach a local minimum. In machine learning or data mining, this function corresponds to a decision model that is to be discovered. In this paper, we propose a preliminary formulation of gradient descent with data privacy preservation. We present two approaches, a stochastic approach and a least-squares approach, under different assumptions. Four protocols are proposed for the two approaches, incorporating various secure building blocks for both horizontally and vertically partitioned data. We conduct experiments to evaluate the scalability of the proposed secure building blocks and the accuracy and efficiency of the protocols in four different scenarios. The experimental results show that the proposed secure building blocks are reasonably scalable and that the proposed protocols allow us to determine the more suitable secure protocol for the application in each scenario.
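
For context on the update rule the abstract refers to, below is a minimal sketch of plain (non-private) gradient descent on a differentiable target function. The quadratic example target, its gradient, and the learning rate eta are illustrative assumptions; none of the paper's secure building blocks or partitioned-data protocols are reproduced here.

    import numpy as np

    def gradient_descent(grad, w0, eta=0.1, steps=200):
        """Repeatedly step against the gradient to approach a local minimum."""
        w = np.asarray(w0, dtype=float)
        for _ in range(steps):
            w = w - eta * grad(w)  # update rule: w <- w - eta * grad f(w)
        return w

    # Illustrative target (an assumption, not from the paper):
    # f(w) = ||w - c||^2 has gradient 2 * (w - c) and its minimum at w = c.
    c = np.array([1.0, -2.0])
    w_star = gradient_descent(lambda w: 2.0 * (w - c), w0=np.zeros(2))
    print(w_star)  # converges toward [1.0, -2.0]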
ISSN: 1041-4347
EISSN: 1558-2191
DOI: 10.1109/TKDE.2009.153