Privacy Auctions for Recommender Systems

We study a market for private data in which a data analyst publicly releases a statistic computed over a database of private information. Individuals who own the data incur a cost for their loss of privacy proportional to the differential privacy guarantee given by the analyst at the time of the release. The analyst incentivizes individuals by compensating them, giving rise to a privacy auction. Motivated by recommender systems, the statistic we consider is a linear predictor function with publicly known weights. The statistic can be viewed as a prediction of the unknown data of a new individual, based on the data of individuals in the database. We formalize the trade-off between privacy and accuracy in this setting, and show that a simple class of estimates achieves an order-optimal trade-off. It thus suffices to focus on auction mechanisms that output such estimates. We use this observation to design a truthful, individually rational, proportional-purchase mechanism under a fixed budget constraint. We show that our mechanism achieves a 5-approximation in accuracy compared to the optimal mechanism, and that no truthful mechanism can achieve a (2 − ε)-approximation for any ε > 0.
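As a rough illustration of the kind of estimate involved, the sketch below computes a differentially private weighted-sum prediction using the standard Laplace mechanism. It is not the paper's mechanism; it only assumes (hypothetically) that each data value lies in [0, 1], that the weights w_i are public and non-negative, and that individual i is promised eps_i-differential privacy, so noise is calibrated to the largest ratio w_i / eps_i.

```python
import numpy as np

def private_linear_predictor(x, w, eps, rng=None):
    """Illustrative sketch: a Laplace-noised weighted-sum prediction.

    x   : private data values, assumed to lie in [0, 1]
    w   : publicly known non-negative weights
    eps : per-individual privacy parameters (individual i is promised
          eps[i]-differential privacy)

    Changing individual i's data by at most 1 changes the weighted sum
    by at most w[i], so Laplace noise with scale max_i w[i] / eps[i]
    covers every individual at their promised privacy level.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, w, eps = map(np.asarray, (x, w, eps))
    scale = np.max(w / eps)  # noise calibrated to the worst-case individual
    return float(w @ x + rng.laplace(loc=0.0, scale=scale))

# Example: three individuals, uniform weights, heterogeneous privacy levels.
estimate = private_linear_predictor(
    x=[0.2, 0.9, 0.5],
    w=[1/3, 1/3, 1/3],
    eps=[0.5, 1.0, 2.0],
)
```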
