Information Theoretic Analysis of Privacy in a Multiple Query-Response Based Differentially Private Framework

Data privacy, i.e., safeguarding data from potential threats, has become a critical issue in our data-centric world. Among the mechanisms developed for privacy preservation, differential privacy has emerged as a popular and effective technique that provides the required level of user privacy. In this work, we analyze differential privacy information-theoretically in a multiple query-response environment. We evaluate our model on a real-world database and study the effect of externally added noise on the resulting privacy. The simulation results confirm that the privacy risk is inversely proportional to the amount of noise added to the system (governed by \( \varepsilon \)).
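As a minimal sketch of the mechanism the abstract alludes to, the standard Laplace mechanism adds noise with scale \( \Delta f / \varepsilon \) to each query answer, so a smaller \( \varepsilon \) means more noise and a lower privacy risk. The function names and parameter values below are illustrative assumptions, not the authors' implementation; in a multiple-query setting the privacy budget is conventionally split across the queries via sequential composition.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return a differentially private answer to a numeric query.

    Noise is drawn from Laplace(0, sensitivity / epsilon): a smaller
    epsilon yields a larger noise scale and hence stronger privacy.
    """
    rng = rng if rng is not None else np.random.default_rng()
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(0.0, scale)

def answer_queries(true_answers, sensitivity, total_epsilon, rng=None):
    """Answer k queries under a total budget via sequential composition.

    Each query receives an equal share total_epsilon / k of the budget,
    so answering more queries forces more noise per answer.
    """
    rng = rng if rng is not None else np.random.default_rng()
    per_query_eps = total_epsilon / len(true_answers)
    return [laplace_mechanism(a, sensitivity, per_query_eps, rng)
            for a in true_answers]
```

For example, answering ten counting queries (sensitivity 1) under a total budget of \( \varepsilon = 1 \) gives each answer Laplace noise of scale 10, whereas a single query under the same budget would only need scale 1.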
