Redrawing the boundaries on purchasing data from privacy-sensitive individuals

We prove new positive and negative results concerning the existence of truthful and individually rational mechanisms for purchasing private data from individuals with unbounded and sensitive privacy preferences. We strengthen the impossibility result of Ghosh and Roth (EC 2011) by extending it to a much wider class of privacy valuations. In particular, these include privacy valuations that are based on (ε, δ)-differentially private mechanisms for non-zero δ, ones where the privacy costs are measured in a per-database manner (rather than taking the worst case), and ones that do not depend on the payments made to players (which might not be observable to an adversary). To bypass this impossibility result, we study a natural special setting where individuals have monotonic privacy valuations, which captures common contexts where certain values for private data are expected to lead to higher valuations for privacy (e.g., having a particular disease). We give new mechanisms that are individually rational for all players with monotonic privacy valuations, truthful for all players whose privacy valuations are not too large, and accurate if there are not too many players with too-large privacy valuations. We also prove matching lower bounds showing that in some respects our mechanism cannot be improved significantly.
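For reference, the notion of (ε, δ)-differential privacy invoked above is the standard one: a randomized mechanism M is (ε, δ)-differentially private if, for every pair of databases D, D′ differing in one individual's data and every set S of outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta .
```

Pure differential privacy is the special case δ = 0; allowing δ > 0 (approximate differential privacy) weakens the guarantee, which is why extending the impossibility result to valuations based on (ε, δ)-private mechanisms strengthens it.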

[1] Cynthia Dwork et al. Calibrating Noise to Sensitivity in Private Data Analysis. TCC 2006.

[2] Aaron Roth et al. Mechanism design in large games: incentives and privacy. ITCS 2012.

[3] David Xiao et al. Is privacy compatible with truthfulness? ITCS 2013.

[4] Kunal Talwar et al. On the geometry of differential privacy. STOC 2010.

[5] Aaron Roth et al. Buying private data at auction: the sensitive surveyor's problem. ACM SIGecom Exchanges, 2012.

[6] Guy N. Rothblum et al. Boosting and Differential Privacy. FOCS 2010.

[7] Stephen Chong et al. Truthful mechanisms for agents that value privacy. EC 2011.

[8] Aaron Roth et al. Selling privacy at auction. EC 2011.

[9] Yu-Han Lyu et al. Approximately optimal auctions for selling privacy when costs are correlated with data. EC 2012.

[10] Amos Beimel et al. Private Learning and Sanitization: Pure vs. Approximate Differential Privacy. APPROX-RANDOM 2013.

[11] Aaron Roth et al. Conducting truthful surveys, cheaply. EC 2012.

[12] Ian A. Kash et al. Truthful mechanisms for agents that value privacy. EC 2013.

[13] Sampath Kannan et al. The Exponential Mechanism for Social Welfare: Private, Truthful, and Nearly Optimal. FOCS 2012.

[14] Kunal Talwar et al. Mechanism Design via Differential Privacy. FOCS 2007.

[15] Cynthia Dwork et al. Differential Privacy. ICALP 2006.

[16] Cynthia Dwork et al. Differential privacy and robust statistics. STOC 2009.

[17] Aaron Roth et al. Take It or Leave It: Running a Survey When Privacy Comes at a Cost. WINE 2012.

[18] Aaron Roth et al. Privacy and mechanism design. ACM SIGecom Exchanges, 2013.

[19] Kobbi Nissim et al. Privacy-aware mechanism design. EC 2012.

[20] Anindya De et al. Lower Bounds in Differential Privacy. TCC 2012.

[21] Moshe Tennenholtz et al. Approximately optimal mechanism design via differential privacy. ITCS 2012.