Balancing Privacy-Utility of Differential Privacy Mechanism: A Collaborative Perspective

The differential privacy mechanism maintains privacy-utility monotonicity, so on its own it does not achieve a privacy-utility balance for numerical data. To this end, we balance the privacy and utility of the differential privacy mechanism from a collaborative perspective in this paper. First, we construct a collaborative model that achieves the privacy-utility balance of the differential privacy mechanism. Second, we present a collaborative algorithm for the differential privacy mechanism under this model. Third, our theoretical analysis shows that the collaborative algorithm keeps the privacy-utility balance. Finally, our experimental results demonstrate that the collaborative differential privacy mechanism maintains the privacy-utility balance. Thus, we provide a new collaborative model that solves the privacy-utility balance problem of the differential privacy mechanism, and our collaborative algorithm is easy to apply to query processing over numerical data.
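The abstract does not specify the collaborative algorithm itself, but the baseline it starts from is the standard Laplace mechanism for numerical queries, whose monotone privacy-utility trade-off is the behavior being rebalanced. Below is a minimal sketch of that baseline; the bounded-mean query, data range, and function names are illustrative assumptions, not the authors' method.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return an epsilon-differentially-private answer to a numerical query
    by adding Laplace noise with scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Illustrative usage: a mean query over numerical data bounded in [0, 100].
# With n records, the global sensitivity of the mean is (100 - 0) / n.
data = np.random.uniform(0, 100, size=1000)
sensitivity = 100.0 / len(data)

for epsilon in (0.1, 1.0, 10.0):
    noisy_mean = laplace_mechanism(data.mean(), sensitivity, epsilon)
    # Smaller epsilon gives stronger privacy but larger expected error;
    # this monotone trade-off is what the collaborative model aims to balance.
    print(f"epsilon={epsilon:4.1f}  true={data.mean():6.2f}  noisy={noisy_mean:6.2f}")
```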
