Can Two Walk Together: Privacy Enhancing Methods and Preventing Tracking of Users

We present a new concern that arises when collecting data from individuals who report repeatedly: the mechanisms added to mitigate privacy leakage under multiple reporting can themselves be used to track the users participating in the data collection. We present several definitions of untrackable mechanisms, inspired by the differential privacy framework. Specifically, we define the trackable parameter as the logarithm of the maximum ratio between the probability that a set of reports originated from a single user and the probability that the same set of reports originated from two users (with the same private value), and we explore the implications of this definition. We show how differentially private and untrackable mechanisms can be combined to bound an adversary's ability to detect when a user changed their private value. Examining Google's deployed solution for everlasting privacy, we show that RAPPOR (Erlingsson et al., ACM CCS 2014) is trackable in our framework for the parameters presented in their paper. We analyze Bitwise Everlasting Privacy, a variant of randomized response for collecting statistics on single bits, which achieves good accuracy and everlasting privacy while being only moderately untrackable: its trackable parameter grows linearly in the number of reports. For collecting statistics over larger domains (for histograms and heavy hitters), we present a mechanism that prevents tracking for a limited number of responses. Finally, we present the concept of Mechanism Chaining, in which the output of one mechanism is used as the input of another, in the context of local differential privacy, and show that the chaining of an $\varepsilon_1$-LDP mechanism with an $\varepsilon_2$-LDP mechanism is $\ln\frac{e^{\varepsilon_1+\varepsilon_2}+1}{e^{\varepsilon_1}+e^{\varepsilon_2}}$-LDP, and that this bound is tight.
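
To fix notation, the trackable parameter can be typeset as follows. This is our paraphrase of the definition above, not the paper's exact notation; details such as how the reports are split between the two hypothetical users are elided here.

```latex
% t-untrackability, paraphrasing the definition above (notation ours):
% a mechanism M is t-untrackable if, for every private value v and every
% set of reports R the mechanism can produce,
\[
  \left|
    \ln
    \frac{\Pr[\, R \text{ originated from a single user holding } v \,]}
         {\Pr[\, R \text{ originated from two users, each holding } v \,]}
  \right| \le t .
\]
```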
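
The tension with everlasting privacy is easiest to see in a toy model. RAPPOR's everlasting-privacy guarantee rests on a memoized "permanent randomized response"; the sketch below (ours, and it omits RAPPOR's instantaneous randomization layer, so it is not RAPPOR itself) shows the extreme case: a mechanism that memoizes its noisy bit and replays it verbatim is maximally trackable, since a single user can never emit two different reports.

```python
import math
import random

def memoized_rr_reports(value: int, eps: float, k: int, rng: random.Random) -> list[int]:
    """Toy 'permanent randomized response': the noisy bit is drawn once,
    memoized, and then repeated verbatim in all k reports."""
    keep = math.exp(eps) / (1.0 + math.exp(eps))  # standard eps-LDP keep probability
    memo = value if rng.random() < keep else 1 - value
    return [memo] * k

def log_tracking_ratio_identical(eps: float, r_equals_value: bool) -> float:
    """ln( Pr[R | one user] / Pr[R | two users] ) for R = (r, r, ..., r).

    One memoizing user produces R iff its memoized bit equals r (prob. q);
    two independent users must both memoize r (prob. q^2), so the ratio is 1/q.
    """
    p = math.exp(eps) / (1.0 + math.exp(eps))
    q = p if r_equals_value else 1.0 - p
    return -math.log(q)

if __name__ == "__main__":
    rng = random.Random(0)
    print(memoized_rr_reports(1, eps=1.0, k=5, rng=rng))  # always k copies of one bit
    for eps in (0.5, 1.0, 2.0):
        print(f"eps={eps}: identical-report log ratio = "
              f"{log_tracking_ratio_identical(eps, r_equals_value=False):.4f}")
    # For any report set containing two *different* bits, Pr[one user] = 0 while
    # Pr[two users] > 0, so the trackable parameter of this toy mechanism is
    # unbounded: a single disagreement certifies that at least two users report.
```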
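
The chaining bound can be sanity-checked numerically. Chaining two binary randomized-response mechanisms is the natural candidate for tightness: the reported bit either survives both stages or is flipped twice (the "keep" event), versus being flipped in exactly one stage, and the resulting log-likelihood ratio matches the closed form exactly. The check below is our sketch, not code from the paper.

```python
import math

def rr_keep_prob(eps: float) -> float:
    """Probability that eps-LDP binary randomized response reports the true bit."""
    return math.exp(eps) / (1.0 + math.exp(eps))

def chained_privacy_loss(eps1: float, eps2: float) -> float:
    """Worst-case log-likelihood ratio of two chained binary randomized responses."""
    p1, p2 = rr_keep_prob(eps1), rr_keep_prob(eps2)
    keep = p1 * p2 + (1.0 - p1) * (1.0 - p2)  # survives both stages, or flipped twice
    flip = p1 * (1.0 - p2) + (1.0 - p1) * p2  # flipped in exactly one stage
    return math.log(keep / flip)

def chaining_bound(eps1: float, eps2: float) -> float:
    """Closed-form bound from the abstract: ln((e^{e1+e2}+1)/(e^{e1}+e^{e2}))."""
    return math.log((math.exp(eps1 + eps2) + 1.0) / (math.exp(eps1) + math.exp(eps2)))

if __name__ == "__main__":
    for eps1, eps2 in [(0.5, 0.5), (1.0, 2.0), (3.0, 0.1)]:
        loss = chained_privacy_loss(eps1, eps2)
        bound = chaining_bound(eps1, eps2)
        print(f"eps1={eps1}, eps2={eps2}: loss={loss:.6f}, bound={bound:.6f}")
        assert abs(loss - bound) < 1e-12  # binary RR chaining attains the bound
```

Consistent with closure under post-processing, the chained loss never exceeds $\min(\varepsilon_1, \varepsilon_2)$, so chaining only strengthens the guarantee of either stage alone.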

[1] Úlfar Erlingsson et al. RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response. CCS, 2014.

[2] Raef Bassily et al. Local, Private, Efficient Protocols for Succinct Histograms. STOC, 2015.

[3] Elaine Shi et al. Private and Continual Release of Statistics. ACM TISSEC, 2010.

[4] Moni Naor et al. The Privacy of the Analyst and the Power of the State. FOCS, 2012.

[5] Janardhan Kulkarni et al. Collecting Telemetry Data Privately. NIPS, 2017.

[6] Noga Alon et al. The Probabilistic Method, Second Edition. 2000.

[7] Aaron Roth et al. Local Differential Privacy for Evolving Data. NeurIPS, 2018.

[8] Guy N. Rothblum et al. Boosting and Differential Privacy. FOCS, 2010.

[9] Moni Naor et al. Differential Privacy under Continual Observation. STOC, 2010.

[10] Aaron Roth et al. The Algorithmic Foundations of Differential Privacy. Foundations and Trends in Theoretical Computer Science, 2014.

[11] Moni Naor et al. Pan-Private Streaming Algorithms. ICS, 2010.

[12] Sofya Raskhodnikova et al. What Can We Learn Privately? FOCS, 2008.

[13] Wanrong Zhang et al. Privately Detecting Changes in Unknown Distributions. ICML, 2020.

[14] Yajun Mei et al. Differentially Private Change-Point Detection. NeurIPS, 2018.

[15] Úlfar Erlingsson et al. Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity. SODA, 2019.

[16] Moni Naor et al. How to (not) Share a Password: Privacy Preserving Protocols for Finding Heavy Hitters with Adversarial Behavior. IACR Cryptology ePrint Archive, 2019.

[17] Raef Bassily et al. Practical Locally Private Heavy Hitters. NIPS, 2017.