DP-Sniper: Black-Box Discovery of Differential Privacy Violations using Classifiers

We present DP-Sniper, a practical black-box method that automatically finds violations of differential privacy. DP-Sniper is based on two key ideas: (i) training a classifier to predict whether an observed output was likely generated from one of two possible inputs, and (ii) transforming this classifier into an approximately optimal attack on differential privacy. Our experimental evaluation demonstrates that DP-Sniper obtains up to 12.4 times stronger guarantees than the state of the art while being 15.5 times faster. Further, we show that DP-Sniper is effective in exploiting floating-point vulnerabilities of naively implemented algorithms: it detects that a supposedly 0.1-differentially private implementation of the Laplace mechanism does not in fact satisfy even 0.25-differential privacy.
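To make the two key ideas concrete, the sketch below is a minimal illustration, not the DP-Sniper implementation: the mechanism under test, the logistic-regression classifier, the quantile threshold top_q, and the sample sizes are all assumptions made for the example. It trains a classifier to distinguish outputs produced on two neighboring inputs a and a', thresholds the classifier's confidence to obtain an attack set S, and reports the empirical privacy loss log(Pr[M(a) in S] / Pr[M(a') in S]), which (up to sampling error) lower-bounds the mechanism's true epsilon.

import numpy as np
from sklearn.linear_model import LogisticRegression

def laplace_mechanism(x, eps=0.1, sensitivity=1.0):
    # Hypothetical mechanism under test: Laplace mechanism claiming 0.1-DP.
    return x + np.random.laplace(scale=sensitivity / eps)

def estimate_eps(a, a_prime, n_train=50000, n_test=200000, top_q=0.05):
    # (i) Train a classifier to predict which input produced an observed output.
    out_a = np.array([laplace_mechanism(a) for _ in range(n_train)])
    out_b = np.array([laplace_mechanism(a_prime) for _ in range(n_train)])
    X = np.concatenate([out_a, out_b]).reshape(-1, 1)
    y = np.concatenate([np.ones(n_train), np.zeros(n_train)])
    clf = LogisticRegression().fit(X, y)

    # (ii) Turn the classifier into an attack: S = the outputs the classifier
    # is most confident came from input a (top top_q fraction of scores).
    fresh_a = np.array([laplace_mechanism(a) for _ in range(n_test)]).reshape(-1, 1)
    fresh_b = np.array([laplace_mechanism(a_prime) for _ in range(n_test)]).reshape(-1, 1)
    scores_a = clf.predict_proba(fresh_a)[:, 1]
    scores_b = clf.predict_proba(fresh_b)[:, 1]
    thresh = np.quantile(np.concatenate([scores_a, scores_b]), 1.0 - top_q)
    p_a = (scores_a >= thresh).mean()  # estimate of Pr[M(a) in S]
    p_b = (scores_b >= thresh).mean()  # estimate of Pr[M(a') in S]
    return np.log(p_a / p_b)           # empirical privacy loss over S

print(estimate_eps(a=0.0, a_prime=1.0))

For a correctly implemented 0.1-differentially private Laplace mechanism, the returned estimate stays at or below 0.1 up to sampling error; a floating-point-vulnerable implementation can drive it well above that bound, which is the kind of violation the paper reports.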
