PULP: Achieving Privacy and Utility Trade-Off in User Mobility Data

Leveraging location information in location-based services improves service utility through geo-contextualization. However, it also raises privacy concerns, as new knowledge can be inferred from location records, such as a user's home and work places or personal habits. Location Privacy Protection Mechanisms (LPPMs) provide a means to tackle this problem, but they often require manual configuration, which poses significant challenges to service providers and users, and their impact on data privacy and utility is seldom assessed. In this paper, we present PULP, a model-driven system that automatically provides user-specific privacy protection while preserving service utility by choosing an adequate LPPM and configuring it. At the heart of PULP are nonlinear models that capture, for each individual user, the complex dependency of data privacy and utility on the configuration of the LPPMs considered, namely Geo-Indistinguishability and Promesse. Based on a user's privacy and utility preferences, PULP efficiently recommends a suitable LPPM and its configuration. We evaluate the accuracy of PULP's models and its effectiveness in achieving the privacy-utility trade-off per user, using four real-world mobility traces covering 770 users in total. Our extensive experiments show that PULP preserves utility for the location-based service while adhering to privacy constraints for a large percentage of users, and is orders of magnitude faster than non-model-based alternatives.
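To make the model-then-recommend workflow concrete, the following is a minimal sketch of the idea described above: fit per-user nonlinear curves of privacy and utility against an LPPM configuration parameter, then query the fitted curves to pick a configuration that meets the user's requirements. The power-law functional form, the sample calibration values, the metric names, and the selection rule are illustrative assumptions for this sketch, not the actual models or metrics used in PULP.

```python
# Sketch: per-user nonlinear models of privacy and utility vs. an LPPM
# parameter (e.g., epsilon in Geo-Indistinguishability), then configuration
# selection under the user's constraints. All values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def power_law(eps, a, b):
    """Assumed nonlinear model form: metric = a * eps**b."""
    return a * np.power(eps, b)

# Hypothetical calibration data for one user and one LPPM, e.g., obtained by
# running the LPPM on the user's trace for a few parameter values.
eps_samples     = np.array([0.001, 0.005, 0.01, 0.05, 0.1])   # LPPM parameter
privacy_samples = np.array([0.95, 0.80, 0.65, 0.35, 0.20])    # privacy metric in [0, 1]
utility_samples = np.array([0.30, 0.55, 0.70, 0.88, 0.95])    # utility metric in [0, 1]

# Fit one model per metric for this user/LPPM pair.
(priv_a, priv_b), _ = curve_fit(power_law, eps_samples, privacy_samples, p0=(1.0, -0.5))
(util_a, util_b), _ = curve_fit(power_law, eps_samples, utility_samples, p0=(1.0, 0.5))

def recommend_config(min_privacy, min_utility, eps_grid=np.logspace(-3, -1, 200)):
    """Return an LPPM parameter meeting both requirements, or None if infeasible."""
    privacy = power_law(eps_grid, priv_a, priv_b)
    utility = power_law(eps_grid, util_a, util_b)
    feasible = (privacy >= min_privacy) & (utility >= min_utility)
    if not feasible.any():
        return None
    # Among feasible configurations, pick the one with the highest predicted utility.
    candidates = np.where(feasible)[0]
    return float(eps_grid[candidates[np.argmax(utility[candidates])]])

print(recommend_config(min_privacy=0.6, min_utility=0.5))
```

Fitting the per-user curves once and then querying them, rather than rerunning the LPPM for every candidate configuration, is what would make such a model-based recommender much faster than an exhaustive, non-model-based search.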
