Privacy Tradeoffs in Predictive Analytics

Bibliographic Details
Main Authors: Ioannidis, Stratis; Montanari, Andrea; Weinsberg, Udi; Bhagat, Smriti; Fawaz, Nadia; Taft, Nina
Format: Journal Article
Language: English
Published: 31.03.2014

Summary: Online services routinely mine user data to predict user preferences, make recommendations, and place targeted ads. Recent research has demonstrated that several private user attributes (such as political affiliation, sexual orientation, and gender) can be inferred from such data. Can a privacy-conscious user benefit from personalization while simultaneously protecting her private attributes? We study this question in the context of a rating prediction service based on matrix factorization. We construct a protocol of interactions between the service and users that has remarkable optimality properties: it is privacy-preserving, in that no inference algorithm can succeed in inferring a user's private attribute with a probability better than random guessing; it has maximal accuracy, in that no other privacy-preserving protocol improves rating prediction; and, finally, it involves a minimal disclosure, as the prediction accuracy strictly decreases when the service reveals less information. We extensively evaluate our protocol using several rating datasets, demonstrating that it successfully blocks the inference of gender, age and political affiliation, while incurring less than 5% decrease in the accuracy of rating prediction.
DOI: 10.48550/arxiv.1403.8084
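
The summary describes a rating-prediction service built on matrix factorization: each user and each item is mapped to a low-dimensional latent profile, and a rating is predicted as their inner product. The sketch below (Python with NumPy) illustrates only that baseline factorization on a toy ratings matrix; it is not the paper's privacy-preserving protocol, and the rank, learning rate, regularization, and ratings data are invented for illustration.

Illustrative sketch (Python):

import numpy as np

rng = np.random.default_rng(0)

# Toy ratings matrix: rows = users, columns = items, 0 = unrated (hypothetical data).
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)
mask = R > 0  # observed entries only

n_users, n_items = R.shape
rank = 2                                          # latent dimension (assumed)
U = 0.1 * rng.standard_normal((n_users, rank))    # user profiles
V = 0.1 * rng.standard_normal((n_items, rank))    # item profiles
lr, reg = 0.01, 0.05                              # learning rate and L2 penalty (assumed)

# Plain SGD over observed ratings: minimize (r_ui - <U_u, V_i>)^2 + reg * norms.
for _ in range(2000):
    for u, i in zip(*np.nonzero(mask)):
        Uu, Vi = U[u].copy(), V[i].copy()
        err = R[u, i] - Uu @ Vi
        U[u] += lr * (err * Vi - reg * Uu)
        V[i] += lr * (err * Uu - reg * Vi)

# Predicted ratings, including the unobserved entries the service would recommend from.
R_hat = U @ V.T
print(np.round(R_hat, 2))

Per the summary, the paper's protocol constrains what a user discloses to such a service so that her private attributes (gender, age, political affiliation) cannot be inferred better than random guessing, at less than a 5% cost in rating-prediction accuracy.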