Robust and Private Bayesian Inference

We examine the robustness and privacy of Bayesian inference under assumptions on the prior, without modifying the Bayesian framework itself. First, we generalise the concept of differential privacy to arbitrary dataset distances, outcome spaces, and distribution families. We then prove bounds on the robustness of the posterior, introduce a posterior sampling mechanism, show that it is differentially private, and provide finite-sample bounds for distinguishability-based privacy under a strong adversarial model. Finally, we give examples of models satisfying our assumptions.
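The posterior sampling mechanism releases a single draw from the posterior distribution rather than the posterior itself; under the paper's assumptions (e.g. a suitably bounded or Lipschitz log-likelihood), that draw is differentially private. A minimal sketch for a hypothetical Beta-Bernoulli model (the model, function name, and prior parameters here are illustrative assumptions, not taken from the paper):

```python
import random

def posterior_sample_release(data, alpha=1.0, beta=1.0):
    """Release one draw from the Beta posterior of a Bernoulli parameter.

    data  : iterable of 0/1 observations
    alpha : prior pseudo-count of successes (Beta prior)
    beta  : prior pseudo-count of failures (Beta prior)

    The released sample itself is the privacy mechanism: no extra noise
    is added, and privacy follows from the randomness of the posterior
    draw (given a bounded log-likelihood, as assumed in this setting).
    """
    data = list(data)
    successes = sum(data)
    failures = len(data) - successes
    # Beta prior is conjugate to the Bernoulli likelihood, so the
    # posterior is Beta(alpha + successes, beta + failures).
    return random.betavariate(alpha + successes, beta + failures)

theta = posterior_sample_release([1, 0, 1, 1, 0, 1])
```

The key design point is that the analyst never sees the exact posterior parameters, only one stochastic draw, so neighbouring datasets induce nearby output distributions.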
