Improved Algorithms for Differentially Private Orthogonal Tensor Decomposition

Tensor decompositions have applications in many areas, including signal processing, machine learning, computer vision, and neuroscience. In this paper, we propose two new differentially private algorithms for orthogonal decomposition of symmetric tensors computed from private or sensitive data; such tensors arise in applications such as latent variable models. Differential privacy is a formal framework that provides quantifiable protection against adversarial inference. We investigate the performance of these algorithms under varying privacy and database parameters and compare them against another recently proposed privacy-preserving algorithm. Our experiments show that the proposed algorithms provide good utility while preserving strict privacy guarantees.
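To make the setting concrete, a common recipe for this problem (not necessarily the paper's exact algorithms) is input perturbation: add symmetric Gaussian noise to the empirical moment tensor (the Gaussian mechanism), then run tensor power iteration with deflation to recover the orthogonal components. The sketch below assumes a symmetric third-order tensor; the noise scale `sigma` would have to be calibrated to the tensor's sensitivity and the target (ε, δ), which is omitted here.

```python
import numpy as np
from itertools import permutations

def symmetrize(G):
    # Average over all 6 axis permutations so the noise tensor stays symmetric.
    return sum(np.transpose(G, p) for p in permutations(range(3))) / 6.0

def private_tensor_power(T, n_components, sigma, n_iter=100, seed=0):
    """Sketch: perturb a symmetric 3rd-order tensor with Gaussian noise,
    then extract components via tensor power iteration with deflation.
    `sigma` is a hypothetical noise scale; calibrating it to a privacy
    budget is application-specific and not shown."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    T_priv = T + symmetrize(rng.normal(scale=sigma, size=T.shape))
    eigvals, eigvecs = [], []
    for _ in range(n_components):
        v = rng.normal(size=d)
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            # Power update: v <- T(I, v, v), i.e. contract modes 2 and 3 with v.
            v = np.einsum('ijk,j,k->i', T_priv, v, v)
            v /= np.linalg.norm(v)
        lam = np.einsum('ijk,i,j,k->', T_priv, v, v, v)  # eigenvalue T(v, v, v)
        eigvals.append(lam)
        eigvecs.append(v)
        # Deflate: subtract the recovered rank-1 term before the next component.
        T_priv = T_priv - lam * np.einsum('i,j,k->ijk', v, v, v)
    return np.array(eigvals), np.array(eigvecs)
```

With `sigma = 0` this reduces to the standard (non-private) symmetric orthogonal decomposition; increasing `sigma` trades utility for privacy, which is the trade-off the abstract's experiments measure.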
