Differential Privacy in distribution and instance-based noise mechanisms

In this paper, we introduce the notion of (ε, δ)-differential privacy in distribution, a strong version of the existing (ε, δ)-differential privacy, used to mathematically ensure that the private data of an individual are protected when embedded into a queried database. In practice, this property is obtained by adding suitably calibrated noise. Our new notion simplifies proofs of (ε, δ)-privacy for mechanisms that add noise with a continuous distribution. As a first example, we give a simple proof that the Gaussian mechanism is (ε, δ)-differentially private in distribution. Using differential privacy in distribution, we then give simple conditions for an instance-based noise mechanism to be (ε, δ)-differentially private. We exploit these conditions to design a new (ε, δ)-differentially private instance-based noise algorithm. Compared to existing ones, our algorithm achieves better accuracy when used to answer a query in a differentially private manner. In particular, our algorithm does not require the computation of the so-called smooth sensitivity, which is usually needed by instance-based noise algorithms and has been proved NP-hard to compute in some cases, namely for some statistical queries on graphs. Our algorithm handles such situations, including some cases for which no instance-based noise mechanism was known to perform well.
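For readers unfamiliar with the Gaussian mechanism discussed above, a minimal sketch follows. It shows the standard calibration from the differential privacy literature (Dwork & Roth): to answer a query with global sensitivity Δf under (ε, δ)-differential privacy, one adds Gaussian noise with standard deviation σ = Δf·√(2 ln(1.25/δ))/ε (valid for ε < 1). The function name and parameters are illustrative, not taken from the paper.

```python
import math
import random

def gaussian_mechanism(true_answer, sensitivity, epsilon, delta):
    """Release a noisy query answer via the Gaussian mechanism.

    The noise scale is calibrated to the query's *global* sensitivity,
    i.e. the worst-case change in the answer when one individual's
    record is added or removed. Valid for epsilon < 1.
    """
    sigma = sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon
    return true_answer + random.gauss(0.0, sigma)

# Example: a counting query ("how many rows satisfy P?") has global
# sensitivity 1, since one individual changes the count by at most 1.
noisy_count = gaussian_mechanism(true_answer=42, sensitivity=1.0,
                                 epsilon=0.5, delta=1e-5)
```

Instance-based mechanisms such as the one proposed in the paper instead calibrate the noise to the sensitivity at (or near) the actual database instance, which can be far smaller than the global worst case and hence yields better accuracy.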
