A Graph Symmetrisation Bound on Channel Information Leakage under Blowfish Privacy

Blowfish privacy is a recent generalisation of differential privacy that enables improved utility while preserving the semantic privacy guarantees that have driven the popularity of differential privacy in computer science. This paper relates Blowfish privacy to an important measure of the privacy loss of information channels from the communications theory community: min-entropy leakage. Symmetry in the neighbouring relation on input data is central to known connections between differential privacy and min-entropy leakage. But while differential privacy exhibits strong symmetry, Blowfish neighbouring relations correspond to arbitrary simple graphs, owing to the framework's flexible privacy policies. To bound the min-entropy leakage of Blowfish-private mechanisms, we organise our analysis over symmetric partitions corresponding to the orbits of graph automorphism groups. A construction that meets our bound with asymptotic equality demonstrates its sharpness.
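For intuition, the orbit partition that the analysis is organised over can be computed by brute force on a small policy graph. The sketch below is illustrative only and not taken from the paper: it enumerates the automorphism group of a simple graph and groups vertices into orbits; the function names are hypothetical.

```python
from itertools import permutations

def automorphisms(n, edges):
    """Brute-force the automorphism group of a simple graph on vertices 0..n-1.

    Each automorphism is a vertex permutation that maps edges to edges.
    Feasible only for small n (n! permutations are checked).
    """
    edge_set = {frozenset(e) for e in edges}
    return [perm for perm in permutations(range(n))
            if all(frozenset((perm[u], perm[v])) in edge_set for u, v in edges)]

def orbits(n, autos):
    """Partition vertices 0..n-1 into orbits under the given automorphisms."""
    seen, parts = set(), []
    for v in range(n):
        if v in seen:
            continue
        orbit = {perm[v] for perm in autos}  # images of v under the group
        seen |= orbit
        parts.append(sorted(orbit))
    return parts

# Example: the path graph 0 - 1 - 2 has only the identity and the flip,
# so its orbit partition is {0, 2} and {1}.
autos = automorphisms(3, [(0, 1), (1, 2)])
print(orbits(3, autos))  # → [[0, 2], [1]]
```

On a complete graph (the differential-privacy neighbouring relation, where every pair of inputs is neighbouring) all vertices fall into a single orbit, which reflects the strong symmetry the abstract contrasts with general Blowfish policy graphs.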
