Oblivious Mechanisms in Differential Privacy: Experiments, Conjectures, and Open Questions

Differential privacy (DP) is a framework for quantifying the extent to which individual privacy in a statistical database is preserved while releasing useful aggregate information about the database. In this work, we conduct an exploratory study of questions related to the optimality of noise generation mechanisms (NGMs) in differential privacy, taking into consideration (i) query sensitivity, (ii) query side information, and (iii) the presence of longitudinal and collusion attacks. The results and observations from our study serve three important purposes: (i) they provide conjectures on appropriate (in the sense of privacy-utility tradeoffs) oblivious NGM selection for scalar queries in both non-Bayesian and Bayesian user settings, (ii) they provide supporting evidence and counterexamples to existing theoretical results on the optimality of NGMs when those results are tested under a relaxed assumption set, and (iii) they lead to a string of interesting open questions for the theory community in relation to the design and analysis of provably optimal oblivious differential privacy mechanisms.
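As background on the objects studied here (a sketch of standard material, not of this paper's contributions): an oblivious NGM perturbs a scalar query answer with noise whose distribution depends only on the query's sensitivity and the privacy budget, not on the database contents. A minimal example is the classical Laplace mechanism, assuming sensitivity Δ and budget ε:

```python
import random


def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Oblivious NGM: the noise distribution depends only on (sensitivity, epsilon),
    never on the underlying database, which is what makes the mechanism 'oblivious'."""
    scale = sensitivity / epsilon  # Laplace scale b = Delta / epsilon
    # The difference of two i.i.d. Exponential(mean=scale) draws is Laplace(0, scale).
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_answer + noise


# Example: a counting query (sensitivity 1) answered with epsilon = 0.5.
noisy_count = laplace_mechanism(true_answer=127.0, sensitivity=1.0, epsilon=0.5)
```

Larger ε (weaker privacy) shrinks the noise scale and improves utility; this privacy-utility tradeoff is exactly the axis along which the optimality questions above are posed.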