Privacy protection with heavy-tailed noise for linear dynamical systems

Abstract Privacy protection in linear dynamical systems is investigated in this paper. A standard mechanism employed in the systems and control literature is to mask private data by adding Gaussian noise. A shortcoming of this mechanism is its vulnerability to outliers: light-tailed Gaussian noise cannot effectively hide extreme values. The goal of this paper is to present a novel mechanism that can hide outliers. The key idea is to utilize stably distributed noise, which has two properties preferable for this purpose. One is its heavy-tailed distribution, which is beneficial for hiding extreme values, including scale-free data. The other is its closedness under addition, similarly to the Gaussian distribution, which enables filter/controller design under privacy requirements based on linear control theory. From a theoretical point of view, we quantify the privacy level of the proposed mechanism in terms of differential privacy. The derivation is nontrivial because the density function of the stable distribution has no analytic expression.
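As a rough illustration of the noise-adding mechanism described above (a minimal sketch, not the paper's implementation), the following Python snippet masks a signal containing a single outlier with heavy-tailed alpha-stable noise and, for comparison, with Gaussian noise. All parameter values (alpha, scales, outlier size) are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): masking a signal that
# contains an outlier with heavy-tailed alpha-stable noise versus Gaussian
# noise.  All parameter values (alpha, scales, outlier size) are assumptions
# chosen for demonstration only.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

# Private signal: mostly zero, with one large outlier we would like to hide.
x = np.zeros(100)
x[50] = 25.0

# Gaussian mechanism: light tails, so the outlier still stands out clearly.
y_gauss = x + rng.normal(loc=0.0, scale=1.0, size=x.shape)

# Stable mechanism: alpha < 2 gives heavy tails, so large noise samples are
# common enough that the outlier is far harder to distinguish.
alpha, beta = 1.2, 0.0  # beta = 0: symmetric stable noise
y_stable = x + levy_stable.rvs(alpha, beta, loc=0.0, scale=1.0,
                               size=x.shape, random_state=0)

print("largest |masked value| under Gaussian noise:", np.abs(y_gauss).max())
print("largest |masked value| under stable noise:  ", np.abs(y_stable).max())
```

For reference, the closedness under addition mentioned in the abstract means that if X1 and X2 are independent symmetric α-stable random variables with the same index α and scale parameters γ1 and γ2, then X1 + X2 is again symmetric α-stable with scale (γ1^α + γ2^α)^(1/α); for α = 2 this reduces to the familiar Gaussian case.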
