Significance

Motivated by tensions between data privacy for individual citizens and societal priorities such as counterterrorism, we introduce a computational model that distinguishes between parties for whom privacy is explicitly protected and those for whom it is not (the "targeted" subpopulation). Within this framework, we provide provably privacy-preserving algorithms for targeted search in social networks. We validate the utility of our algorithms with extensive computational experiments on two large-scale social network datasets.

Abstract

Motivated by tensions between data privacy for individual citizens and societal priorities such as counterterrorism and the containment of infectious disease, we introduce a computational model that distinguishes between parties for whom privacy is explicitly protected and those for whom it is not (the targeted subpopulation). The goal is the development of algorithms that can effectively identify and take action upon members of the targeted subpopulation in a way that minimally compromises the privacy of the protected, while simultaneously limiting the expense of distinguishing members of the two groups via costly mechanisms such as surveillance, background checks, or medical testing. Within this framework, we provide provably privacy-preserving algorithms for targeted search in social networks. These algorithms are natural variants of common graph search methods and ensure privacy for the protected by the careful injection of noise into the prioritization of potential targets. We validate the utility of our algorithms with extensive computational experiments on two large-scale social network datasets.
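To make the "injection of noise into the prioritization of potential targets" concrete, the following is a minimal Python sketch, not the paper's exact algorithm: a greedy frontier search that scores each candidate by its number of already-confirmed targeted neighbors and adds Laplace noise before selecting the next node to examine, in the spirit of the report-noisy-max mechanism from differential privacy. The dict-of-sets graph, the is_targeted oracle, the budget parameter, and the noise scale 2/epsilon are all illustrative assumptions.

```python
import math
import random


def laplace(scale: float) -> float:
    """Sample Laplace(0, scale) noise as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def noisy_targeted_search(graph, seeds, is_targeted, budget, epsilon):
    """Greedy targeted search with noisy prioritization (illustrative sketch).

    graph:       dict mapping node -> set of neighbor nodes
    seeds:       initially known targeted nodes
    is_targeted: costly oracle (e.g., a background check or medical test)
    budget:      number of oracle calls allowed
    epsilon:     privacy parameter; smaller epsilon means more noise
    """
    found = set(seeds)     # confirmed members of the targeted subpopulation
    examined = set(seeds)  # nodes on which the costly oracle has been spent
    for _ in range(budget):
        # Frontier: unexamined neighbors of nodes already confirmed targeted.
        frontier = {v for t in found for v in graph[t]} - examined
        if not frontier:
            break
        # Noise masks the influence of any one protected individual's edges
        # on which candidate is examined next.
        nxt = max(
            frontier,
            key=lambda v: sum(u in found for u in graph[v])
            + laplace(2.0 / epsilon),
        )
        examined.add(nxt)
        if is_targeted(nxt):
            found.add(nxt)
    return found


if __name__ == "__main__":
    # Toy 6-node network; nodes 0, 1, and 4 are (secretly) targeted.
    graph = {0: {1, 2}, 1: {0, 3, 4}, 2: {0, 4}, 3: {1}, 4: {1, 2, 5}, 5: {4}}
    targeted = {0, 1, 4}
    hits = noisy_targeted_search(
        graph, seeds={0}, is_targeted=targeted.__contains__, budget=3, epsilon=1.0
    )
    print(sorted(hits))
```

With more noise (smaller epsilon), the choice of whom to examine next depends less on any single protected individual's connections, at the cost of spending more of the oracle budget on non-targets; this is the privacy-utility trade-off the experiments in the paper quantify.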