Are We There Yet? Timing and Floating-Point Attacks on Differential Privacy Systems

Differential privacy is the de facto privacy framework in practice, with adoption across a number of mature software platforms. Implementations of differentially private (DP) mechanisms must be engineered carefully if end-to-end security guarantees are to hold. In this paper we study two implementation flaws in the noise generation commonly used in DP systems. First, we examine the Gaussian mechanism's susceptibility to a floating-point representation attack, a vulnerability similar in premise to the attack Mironov carried out against the Laplace mechanism in 2012. Our experiments show the attack succeeds against DP algorithms, including deep learning models trained with differentially private stochastic gradient descent (DP-SGD). In the second part of the paper we study discrete counterparts of the Laplace and Gaussian mechanisms, which were previously proposed to sidestep the shortcomings of floating-point representations of real numbers. We show that such implementations unfortunately suffer from another side channel: a novel timing attack. An observer who can measure the time taken to draw (discrete) Laplace or Gaussian noise can predict the noise magnitude and use that prediction to recover sensitive attributes; this invalidates the differential privacy guarantees of systems implementing such mechanisms. We demonstrate that several commonly used, state-of-the-art implementations of differential privacy are susceptible to these attacks, with success rates of up to 92.56% for floating-point attacks on DP-SGD and up to 99.65% for end-to-end timing attacks on a private sum protected with the discrete Laplace mechanism. Finally, we evaluate and suggest partial mitigations.
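To make the two flaws concrete, we sketch both in Python below. These are minimal illustrations of the root causes, written for this exposition; they are not the attack code evaluated in the paper, and the helper names (`exp_noise`, `discrete_laplace_instrumented`) are ours.

First, the floating-point artifact. A textbook inverse-CDF sampler produces noise whose support is a structured subset of the doubles; adding the (secret) query result then rounds the noise onto a grid that depends on that result. Attacks in the spirit of Mironov's test which candidate result leaves an attainable noise residue. The sketch below demonstrates only the input-dependent rounding, which is the property such membership tests exploit:

```python
import math
import random

def exp_noise():
    # Textbook one-sided exponential sampler over IEEE-754 doubles.
    # random.random() returns k * 2**-53 for a uniform integer k, so the
    # sampler's outputs form a structured, checkable subset of the doubles.
    return -math.log(random.random())

# A release fl(result + noise) commits to a rounded copy of the noise whose
# grid depends on the query result: with result 0 the residue equals the
# noise bit-for-bit, while with result 1 low-order bits are usually lost.
altered = sum(
    (1.0 + n) - 1.0 != n  # residue under result 1 vs. the noise drawn
    for n in (exp_noise() for _ in range(100_000))
)
print(f"{altered} of 100000 residues differ from the sampled noise")
```

Second, the timing channel. Discrete mechanisms are often sampled by rejection: a geometric magnitude is built from repeated Bernoulli trials, so the number of trials, and hence the running time, tracks the magnitude drawn. The sampler below is a simplified discrete Laplace in that style (far simpler than production samplers), instrumented to count trials as a stand-in for wall-clock time:

```python
import math
import random
import statistics  # statistics.correlation requires Python >= 3.10

def discrete_laplace_instrumented(scale):
    # Simplified discrete Laplace: geometric magnitude from repeated
    # Bernoulli(exp(-1/scale)) trials plus a random sign, rejecting the
    # duplicate (negative, zero) outcome. `steps` counts trials taken.
    p = math.exp(-1.0 / scale)
    steps = 0
    while True:
        magnitude = 0
        while random.random() < p:  # one Bernoulli trial per loop pass
            magnitude += 1
            steps += 1
        steps += 1  # the terminating (failed) trial costs time too
        negative = random.random() < 0.5
        if not (negative and magnitude == 0):
            return (-magnitude if negative else magnitude), steps

draws = [discrete_laplace_instrumented(10.0) for _ in range(10_000)]
magnitudes = [abs(noise) for noise, _ in draws]
times = [steps for _, steps in draws]
# Trial counts are essentially |noise| plus small rejection overhead, so an
# observer timing the sampler learns the noise magnitude almost exactly:
print("corr(|noise|, time) =", statistics.correlation(magnitudes, times))
```

An observer who recovers |noise| this way can largely undo the perturbation of a released sum, which is the end-to-end setting behind the 99.65% figure above; constant-time samplers of the kind developed for lattice-based cryptography are the natural, though partial, countermeasure.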

[1] Stefano Braghin, et al. Secure Random Sampling in Differential Privacy, 2021, ESORICS.

[2] Martin T. Vechev, et al. DP-Sniper: Black-Box Discovery of Differential Privacy Violations using Classifiers, 2021, IEEE Symposium on Security and Privacy (SP).

[3] P. Kairouz, et al. The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation, 2021, ICML.

[4] Debdeep Mukhopadhyay, et al. Compact and Secure Generic Discrete Gaussian Sampler based on HW/SW Co-design, 2020, Asian Hardware Oriented Security and Trust Symposium (AsianHOST).

[5] Thomas Steinke, et al. The Discrete Gaussian for Differential Privacy, 2020, NeurIPS.

[6] Christina Ilvento, et al. Implementing the Exponential Mechanism with Base-2 Differential Privacy, 2019, CCS.

[7] Weijie J. Su, et al. Deep Learning with Gaussian Differential Privacy, 2019, Harvard Data Science Review.

[8] Wouter Joosen, et al. Timeless Timing Attacks: Exploiting Concurrency to Leak Secrets over Remote Connections, 2020, USENIX Security Symposium.

[9] Calton Pu, et al. Differentially Private Model Publishing for Deep Learning, 2019, IEEE Symposium on Security and Privacy (SP).

[10] Janardhan Kulkarni, et al. An Algorithmic Framework For Differentially Private Data Analysis on Trusted Processors, 2018, NeurIPS.

[11] Frederik Vercauteren, et al. Constant-Time Discrete Gaussian Sampling, 2018, IEEE Transactions on Computers.

[12] Timon Gehr, et al. DP-Finder: Finding Differential Privacy Violations by Sampling and Optimization, 2018, CCS.

[13] John M. Abowd, et al. The U.S. Census Bureau Adopts Differential Privacy, 2018, KDD.

[14] Thomas Steinke, et al. Composable and versatile privacy via truncated CDP, 2018, STOC.

[15] Meng Wu, et al. Eliminating timing side-channel leaks using program repair, 2018, ISSTA.

[16] Danfeng Zhang, et al. Detecting Violations of Differential Privacy, 2018, CCS.

[17] Yu-Xiang Wang, et al. Improving the Gaussian Mechanism for Differential Privacy: Analytical Calibration and Optimal Denoising, 2018, ICML.

[18] Ayesha Khalid, et al. On Practical Discrete Gaussian Samplers for Lattice-Based Cryptography, 2018, IEEE Transactions on Computers.

[19] H. Brendan McMahan, et al. Learning Differentially Private Recurrent Language Models, 2017, ICLR.

[20] Salil P. Vadhan, et al. Differential Privacy on Finite Computers, 2017, ITCS.

[21] Dawn Xiaodong Song, et al. Towards Practical Differential Privacy for SQL Queries, 2017, Proc. VLDB Endow.

[22] Úlfar Erlingsson, et al. Prochlo: Strong Privacy for Analytics in the Crowd, 2017, SOSP.

[23] Daniele Micciancio, et al. Gaussian Sampling over the Integers: Efficient, Generic, Constant-Time, 2017, CRYPTO.

[24] Ilya Mironov, et al. Rényi Differential Privacy, 2017, IEEE Computer Security Foundations Symposium (CSF).

[25] Tanja Lange, et al. Flush, Gauss, and reload: a cache attack on the BLISS lattice-based signature scheme, 2016.

[26] Ian Goodfellow, et al. Deep Learning with Differential Privacy, 2016, CCS.

[27] Guy N. Rothblum, et al. Concentrated Differential Privacy, 2016, arXiv.

[28] Dale Miller, et al. Preserving differential privacy under finite-precision semantics, 2013, Theor. Comput. Sci.

[29] Sorin Lerner, et al. On Subnormal Floating Point and Abnormal Timing, 2015, IEEE Symposium on Security and Privacy.

[30] Alexander J. Smola, et al. Privacy for Free: Posterior Sampling and Stochastic Gradient Monte Carlo, 2015, ICML.

[31] János Folláth. Gaussian Sampling in Lattice Based Cryptography, 2014.

[32] Aaron Roth, et al. The Algorithmic Foundations of Differential Privacy, 2014, Found. Trends Theor. Comput. Sci.

[33] Raef Bassily, et al. Differentially Private Empirical Risk Minimization: Efficient Algorithms and Tight Error Bounds, 2014, arXiv:1405.7085.

[34] Andreas Haeberlen, et al. Differential Privacy: An Economic Method for Choosing Epsilon, 2014, IEEE Computer Security Foundations Symposium.

[35] Anand D. Sarwate, et al. Stochastic gradient descent with differentially private updates, 2013, IEEE Global Conference on Signal and Information Processing.

[36] Léo Ducas, et al. Lattice Signatures and Bimodal Gaussians, 2013, IACR Cryptol. ePrint Arch.

[37] Chris Peikert, et al. On Ideal Lattices and Learning with Errors over Rings, 2010, JACM.

[38] Ilya Mironov, et al. On significance of the least significant bits for differential privacy, 2012, CCS.

[39] Bart Coppens, et al. Compiler mitigations for time attacks on modern x86 processors, 2012, TACO.

[40] Andreas Haeberlen, et al. Differential Privacy Under Fire, 2011, USENIX Security Symposium.

[41] Danfeng Zhang, et al. Predictive black-box mitigation of timing channels, 2010, CCS.

[42] Hovav Shacham, et al. Hey, you, get off of my cloud: exploring information leakage in third-party compute clouds, 2009, CCS.

[43] Frank McSherry, et al. Privacy integrated queries: an extensible platform for privacy-preserving data analysis, 2009, SIGMOD.

[44] Wayne Luk, et al. A hardware Gaussian noise generator using the Box-Muller method and its error analysis, 2006, IEEE Transactions on Computers.

[45] Cynthia Dwork, et al. Calibrating Noise to Sensitivity in Private Data Analysis, 2006, TCC.

[46] G. Marsaglia, et al. The Ziggurat Method for Generating Random Variables, 2000.

[47] T. A. Bray, et al. A Convenient Method for Generating Normal Variables, 1964.

[48] G. Marsaglia. Generating a Variable from the Tail of the Normal Distribution, 1964.