Time Series Sanitization with Metric-Based Privacy

The increasing popularity of connected devices has given rise to vast amounts of time series data. Due to consumer privacy concerns, the data collected from individual devices must be sanitized before being shared with untrusted third parties. However, existing time series privacy solutions do not provide provable guarantees for individual time series and may not extend to data from a wide range of application domains. In this paper, we adopt a generalized privacy notion based on differential privacy for individual time series sanitization and use the Discrete Cosine Transform (DCT) to model the characteristics of time series data. We extend previously reported 2-dimensional results to arbitrary k-dimensional space. Empirical evaluation on various datasets demonstrates the applicability of our proposed method, both in terms of standard mean squared error (MSE) and in classification tasks.
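As a rough illustration of how such a pipeline might look, the sketch below transforms a series with the DCT, perturbs its leading coefficients with a k-dimensional Laplace mechanism (a standard construction for metric-based differential privacy), and inverts the transform. The function name sanitize_series, the parameters k and epsilon, and the specific noise mechanism are assumptions made for illustration, not the paper's exact algorithm.

```python
# Hypothetical sketch (not the paper's algorithm): sanitize a 1-D time series
# by perturbing its leading DCT coefficients with a k-dimensional Laplace
# mechanism whose density is proportional to exp(-epsilon * ||z||).
import numpy as np
from scipy.fft import dct, idct

def sanitize_series(x, k=16, epsilon=1.0, rng=None):
    """Return a sanitized copy of the series `x`.

    k       -- number of leading DCT coefficients to keep and perturb
    epsilon -- metric-privacy parameter (noise magnitude scales with 1/epsilon)
    """
    rng = np.random.default_rng() if rng is None else rng
    coeffs = dct(np.asarray(x, dtype=float), norm='ortho')
    k = min(k, coeffs.size)

    # k-dimensional Laplace noise: uniform direction on the unit sphere,
    # radius drawn from Gamma(shape=k, scale=1/epsilon).
    direction = rng.normal(size=k)
    direction /= np.linalg.norm(direction)
    radius = rng.gamma(shape=k, scale=1.0 / epsilon)

    noisy = np.zeros_like(coeffs)
    noisy[:k] = coeffs[:k] + radius * direction   # keep only the perturbed top-k terms
    return idct(noisy, norm='ortho')              # reconstruct the sanitized series
```

Perturbing a compact DCT representation rather than the raw samples follows the intuition in the abstract: the leading coefficients capture most of the signal's structure, so calibrated noise in that k-dimensional space can trade utility (MSE, classification accuracy) against the metric-based privacy guarantee.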
