On Choosing Between Privacy Preservation Mechanisms for Mobile Trajectory Data Sharing

Various notions of privacy preservation have been proposed for mobile trajectory data sharing and publication. The privacy guarantees these approaches provide are theoretically very different and cannot be compared against each other directly: they are motivated by different adversary models, making varying assumptions about the adversary's background knowledge and intent. A clear comparison between existing mechanisms is therefore missing, which makes it difficult for a data aggregator or owner to pick a mechanism for a given application scenario. We seek to fill this gap by proposing a measure called STRAP that allows different trajectory privacy mechanisms to be compared on a common scale. We also study the trade-off between privacy and utility, i.e., how different mechanisms perform when utility constraints are imposed on them. Applying STRAP to two real mobile trajectory datasets, we compare state-of-the-art mechanisms for trajectory data privacy and demonstrate the value of the proposed measure.

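To make the idea of placing different mechanisms "on a common scale" concrete, the following is a minimal illustrative sketch, not the paper's STRAP measure (whose definition is not given in this abstract). It pairs a toy linkage-attack success rate (an empirical privacy risk) with mean spatial error (an empirical utility loss) so that any trajectory perturbation mechanism can be scored on the same two axes. All function names, the nearest-neighbour attack, and the example mechanisms are assumptions made purely for illustration.

```python
# Illustrative sketch only: NOT the STRAP measure from the paper.
# It scores each mechanism on (linkage-attack success, mean spatial error)
# so different mechanisms can be compared on one empirical scale.
import numpy as np

def linkage_attack_success(original, released):
    """Fraction of released trajectories a nearest-neighbour adversary links
    back to the correct original trajectory (higher = less private)."""
    hits = 0
    for i, r in enumerate(released):
        # The toy adversary matches each released trajectory to the closest
        # original trajectory under mean point-wise Euclidean distance.
        dists = [np.mean(np.linalg.norm(r - o, axis=1)) for o in original]
        if int(np.argmin(dists)) == i:
            hits += 1
    return hits / len(released)

def spatial_error(original, released):
    """Mean point-wise displacement introduced by the mechanism (lower = more useful)."""
    return float(np.mean([np.mean(np.linalg.norm(r - o, axis=1))
                          for o, r in zip(original, released)]))

def compare_mechanisms(original, mechanisms):
    """Apply each mechanism to the same data and report (privacy risk, utility loss)."""
    report = {}
    for name, mech in mechanisms.items():
        released = [mech(t) for t in original]
        report[name] = (linkage_attack_success(original, released),
                        spatial_error(original, released))
    return report

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy dataset: 50 random-walk trajectories of 20 (x, y) points each.
    data = [np.cumsum(rng.normal(size=(20, 2)), axis=0) for _ in range(50)]
    mechanisms = {
        "identity (no protection)": lambda t: t,
        "gaussian perturbation": lambda t: t + rng.normal(scale=2.0, size=t.shape),
    }
    for name, (risk, err) in compare_mechanisms(data, mechanisms).items():
        print(f"{name}: linkage success={risk:.2f}, mean spatial error={err:.2f}")
```

Under this toy scoring, the unprotected release is fully linkable with zero spatial error, while stronger perturbation trades linkage risk for utility loss, which is the kind of privacy-utility trade-off the abstract refers to.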