A mean-field Stackelberg game approach for obfuscation adoption in empirical risk minimization

Data ecosystems are becoming larger and more complex, while privacy concerns threaten to erode their potential benefits. Recently, users have developed obfuscation techniques that issue fake search engine queries, undermine location tracking algorithms, or evade government surveillance. These techniques create two conflicts: one between each user and the machine learning algorithms that track them, and another among the users themselves. We use game theory to capture the first conflict with a Stackelberg game and the second with a mean field game. We combine the two into a bi-level framework that quantifies accuracy using empirical risk minimization and privacy using differential privacy. We identify necessary and sufficient conditions under which 1) each user is incentivized to obfuscate if other users are obfuscating, 2) the tracking algorithm can avoid this by promising a level of privacy protection, and 3) this promise is incentive-compatible for the tracking algorithm.
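The accuracy-privacy trade-off the abstract describes can be illustrated with a minimal sketch of differentially private empirical risk minimization via output perturbation. This is an illustration only, not the paper's model: the function name, the squared-loss objective, and the clipping bounds are assumptions chosen for simplicity. The empirical mean minimizes the squared-loss empirical risk, and adding Laplace noise calibrated to its sensitivity yields epsilon-differential privacy.

```python
import numpy as np

def private_mean(data, epsilon, lo=0.0, hi=1.0, rng=None):
    """epsilon-DP estimate of the mean of data clipped to [lo, hi].

    Illustrative output-perturbation sketch, not the paper's mechanism.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(data, dtype=float), lo, hi)
    n = len(x)
    # ERM step: the empirical mean is argmin_m (1/n) * sum((x_i - m)^2).
    erm_solution = x.mean()
    # Changing one record moves the mean by at most (hi - lo) / n,
    # so that is the mechanism's L1 sensitivity.
    sensitivity = (hi - lo) / n
    # Laplace noise with scale sensitivity/epsilon gives epsilon-DP.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return erm_solution + noise
```

A larger epsilon (a weaker privacy promise) shrinks the noise and improves accuracy; a smaller epsilon does the opposite. This is the lever the tracking algorithm, as Stackelberg leader, can commit to in order to make obfuscation unattractive for the population of users.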
