Universally utility-maximizing privacy mechanisms

A mechanism for releasing information about a statistical database with sensitive data must resolve a trade-off between utility and privacy. Publishing fully accurate information maximizes utility and minimizes privacy, whereas publishing random noise accomplishes the opposite. Privacy can be rigorously quantified using the framework of differential privacy, which requires that a mechanism's output distribution be nearly the same whether a given database row is included or excluded. The goal of this paper is to provide strong and general utility guarantees, subject to differential privacy. We pursue mechanisms that guarantee near-optimal utility to every potential user, independent of that user's side information (modeled as a prior distribution over query results) and preferences (modeled via a loss function). Our main result: for each fixed count query and differential privacy level, there is a geometric mechanism M* -- a discrete variant of the simple and well-studied Laplace mechanism -- that is expected-loss-minimizing for every possible user simultaneously, subject to the differential privacy constraint. This is an extremely strong utility guarantee: every potential user u, no matter what its side information and preferences, derives as much utility from M* as from interacting with a differentially private mechanism Mu that is optimally tailored to u. More precisely, for every user u there is an optimal mechanism Mu for it that factors into a user-independent part (the geometric mechanism M*) followed by user-specific post-processing that can be delegated to the user itself. The first part of our proof characterizes the optimal differentially private mechanism for a fixed but arbitrary user as a certain basic feasible solution of a linear program whose constraints encode differential privacy. The second part shows that all of the relevant vertices of this polytope (ranging over all possible users) are derivable from the geometric mechanism via suitable remappings of its range.
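For concreteness, the standard definitions behind the abstract can be written out as follows; the notation is ours, chosen to match common usage rather than quoted from the paper.

```latex
% A randomized mechanism M is \epsilon-differentially private if, for all
% databases D, D' differing in a single row and all sets S of outputs,
\Pr[M(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[M(D') \in S].
% The \epsilon-geometric mechanism releases f(D) + Z for a count query f,
% where Z is two-sided geometric noise: with \alpha = e^{-\epsilon},
\Pr[Z = z] \;=\; \frac{1-\alpha}{1+\alpha}\,\alpha^{|z|}, \qquad z \in \mathbb{Z}.
```

Below is a minimal sketch of the two-stage structure described above, relying on the standard fact that the difference of two i.i.d. geometric variables is two-sided geometric. The function names and the clamping post-processor are illustrative assumptions, not the paper's notation; a real user would remap the released value according to its own prior and loss function.

```python
import math
import random

def two_sided_geometric(alpha: float) -> int:
    """Sample Z with Pr[Z = z] proportional to alpha**abs(z), z an integer."""
    # Inverse-CDF sampling of a Geometric(1 - alpha) variable on {0, 1, 2, ...};
    # using 1 - random() keeps the argument of log in (0, 1].
    g1 = math.floor(math.log(1.0 - random.random()) / math.log(alpha))
    g2 = math.floor(math.log(1.0 - random.random()) / math.log(alpha))
    # The difference of two i.i.d. geometrics is two-sided geometric.
    return g1 - g2

def geometric_mechanism(true_count: int, epsilon: float) -> int:
    """User-independent stage: release the count plus geometric noise."""
    alpha = math.exp(-epsilon)
    return true_count + two_sided_geometric(alpha)

def clamp_postprocess(noisy_count: int, n: int) -> int:
    """User-specific stage (illustrative): remap to the feasible range [0, n].

    Post-processing never touches the database, so it cannot degrade the
    differential privacy guarantee of the first stage.
    """
    return max(0, min(n, noisy_count))

# Example: a count query over a database of n = 100 rows.
noisy = geometric_mechanism(true_count=42, epsilon=0.5)
print(clamp_postprocess(noisy, n=100))
```

Because the noise stage is fixed and the remapping operates only on the released value, the composition mirrors the factorization result: one mechanism serves every user, and each user recovers its optimal mechanism by post-processing alone.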
