Online and Differentially-Private Tensor Decomposition

In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method that enjoy these strong properties. We present the first guarantees for the online tensor power method, which has a linear memory requirement. Moreover, we present a noise-calibrated tensor power method with efficient privacy guarantees. At the heart of all these guarantees lies a careful perturbation analysis derived in this paper, which improves significantly upon existing results.
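
The primitive underlying all of these variants is the rank-1 tensor power update u ← T(I, u, u) / ||T(I, u, u)||, where T(I, u, u)_i = Σ_{j,k} T_{ijk} u_j u_k for a symmetric third-order tensor T. The sketch below illustrates this update together with a generic noise-injected variant in the spirit of the noisy power method; it is a minimal illustration under stated assumptions, not the paper's actual algorithm or its privacy calibration. The function names (`tensor_apply`, `noisy_tensor_power_method`) and the noise scale `sigma` are hypothetical choices for this sketch, which assumes a dense symmetric tensor stored as a NumPy array.

```python
import numpy as np

def tensor_apply(T, u):
    """Compute T(I, u, u): contract a symmetric 3rd-order tensor T
    with the vector u along its last two modes."""
    return np.einsum('ijk,j,k->i', T, u, u)

def noisy_tensor_power_method(T, n_iter=100, sigma=0.0, rng=None):
    """Power iteration on a symmetric tensor T in R^{d x d x d}.

    With sigma > 0, isotropic Gaussian noise is added to each iterate
    before normalization, mimicking a noise-injected variant (the noise
    scale here is a hypothetical parameter, not the paper's calibration).
    Returns an estimated eigenpair (lam, u) with T(u, u, u) ~= lam.
    """
    rng = np.random.default_rng(rng)
    d = T.shape[0]
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        v = tensor_apply(T, u) + sigma * rng.standard_normal(d)
        u = v / np.linalg.norm(v)
    lam = u @ tensor_apply(T, u)  # estimate of T(u, u, u)
    return lam, u
```

As a quick sanity check, running this on a noiseless rank-1 tensor T = λ · a⊗a⊗a (built via `np.einsum('i,j,k->ijk', a, a, a)`) recovers ±a; with sigma > 0, the per-iteration Gaussian perturbation plays the role that the paper's perturbation analysis is designed to control.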
