Two-Party Privacy Games: How Users Perturb When Learners Preempt

Internet tracking technologies and wearable electronics provide a vast amount of data to machine learning algorithms. This stock of data stands to increase with the development of the Internet of Things and cyber-physical systems. These technologies clearly promise benefits, but they also raise the risk of sensitive information disclosure. To mitigate this risk, machine learning algorithms can add noise to their outputs according to the formulations provided by differential privacy. At the same time, users can fight for privacy by injecting noise into the data that they report. In this paper, we conceptualize the interactions between privacy and accuracy, and between user (input) perturbation and learner (output) perturbation, in machine learning, using the frameworks of empirical risk minimization, differential privacy, and Stackelberg games. In particular, we solve for the Stackelberg equilibrium in the case of an averaging query. We find that, in equilibrium, either the users perturb their data before submission or the learner perturbs the machine learning output, but never both. Specifically, the learner perturbs if and only if the number of users exceeds a threshold that increases with the degree to which incentives are misaligned. Prompted by these conclusions, and by some observations from privacy ethics, we also suggest future directions. While other work in this area has studied privacy markets and mechanism design for truthful reporting of user information, we take a different viewpoint by considering both user and learner perturbation. We hope that this effort will open the door to future work on differential privacy games.
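As a concrete point of reference, the sketch below contrasts the two perturbation channels for an averaging query using the standard Laplace mechanism of differential privacy. It is a minimal illustration, not the paper's game-theoretic model: the function names, the [0, 1] data range, and the choice of epsilon are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's model): the two perturbation
# channels for a differentially private averaging query over n user
# values assumed to lie in [lo, hi].

def output_perturbation(data, epsilon, lo=0.0, hi=1.0):
    # Learner-side perturbation: compute the true average, then add
    # noise via the standard Laplace mechanism. For an average of n
    # bounded values, the global sensitivity is (hi - lo) / n, so the
    # required noise shrinks as the number of users grows.
    n = len(data)
    sensitivity = (hi - lo) / n
    return float(np.mean(data)) + np.random.laplace(scale=sensitivity / epsilon)

def input_perturbation(data, epsilon, lo=0.0, hi=1.0):
    # User-side perturbation: each user adds Laplace noise to their own
    # value before submission (a local-model mechanism protecting a
    # single value of sensitivity hi - lo); the learner then averages
    # the noisy reports as-is.
    sensitivity = hi - lo
    noisy = data + np.random.laplace(scale=sensitivity / epsilon, size=len(data))
    return float(np.mean(noisy))

values = np.random.uniform(0.0, 1.0, size=1000)  # hypothetical user data
print(output_perturbation(values, epsilon=0.5))
print(input_perturbation(values, epsilon=0.5))
```

Under these assumptions, the output-perturbation noise has scale (hi - lo)/(n * epsilon), while the averaged input-perturbation noise decays only like 1/sqrt(n). This gives one intuition for the threshold result stated above: learner-side perturbation becomes relatively cheap as the number of users grows.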
