[1] Sham M. Kakade, et al. A spectral algorithm for learning Hidden Markov Models, 2008, J. Comput. Syst. Sci.
[2] Anima Anandkumar, et al. Tensor decompositions for learning latent variable models, 2012, J. Mach. Learn. Res.
[3] Aditya Bhaskara, et al. Uniqueness of Tensor Decompositions with Applications to Polynomial Identifiability, 2013, COLT.
[4] Jean-François Cardoso, et al. Super-symmetric decomposition of the fourth-order cumulant tensor. Blind identification of more sources than sensors, 1991, Proceedings of ICASSP '91.
[5] Giorgio Ottaviani, et al. On Generic Identifiability of 3-Tensors of Small Rank, 2011, SIAM J. Matrix Anal. Appl.
[6] Tselil Schramm, et al. Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors, 2015, STOC.
[7] Tselil Schramm, et al. A Robust Spectral Algorithm for Overcomplete Tensor Decomposition, 2019, COLT.
[8] Elchanan Mossel, et al. Learning nonsingular phylogenies and hidden Markov models, 2005, STOC '05.
[9] Lieven De Lathauwer, et al. Fourth-Order Cumulant-Based Blind Identification of Underdetermined Mixtures, 2007, IEEE Transactions on Signal Processing.
[10] David P. Woodruff, et al. Relative Error Tensor Low Rank Approximation, 2017, Electron. Colloquium Comput. Complex.
[11] Ankur Moitra, et al. Algorithmic Aspects of Machine Learning, 2018.
[12] C. Matias, et al. Identifiability of parameters in latent structure models with many observed variables, 2008, arXiv:0809.5032.
[13] Tamara G. Kolda, et al. Tensor Decompositions and Applications, 2009, SIAM Rev.
[14] Sanjoy Dasgupta, et al. Maximum Likelihood Estimation for Mixtures of Spherical Gaussians is NP-hard, 2017, J. Mach. Learn. Res.
[15] Mikhail Belkin, et al. The More, the Merrier: the Blessing of Dimensionality for Learning Large Gaussian Mixtures, 2013, COLT.
[16] Christopher J. Hillar, et al. Most Tensor Problems Are NP-Hard, 2009, J. ACM.
[17] J. Kruskal. Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics, 1977.
[18] Aditya Bhaskara, et al. Smoothed Analysis in Unsupervised Learning via Decoupling, 2019, FOCS.
[19] Johan Håstad. Tensor Rank is NP-Complete, 1989, ICALP.
[20] Tengyu Ma, et al. Polynomial-Time Tensor Decompositions with Sum-of-Squares, 2016, FOCS.
[21] Ankur Moitra, et al. Settling the Polynomial Learnability of Mixtures of Gaussians, 2010, FOCS.
[22] Sham M. Kakade, et al. Learning mixtures of spherical Gaussians: moment methods and spectral decompositions, 2012, ITCS '13.
[23] Kane, et al. Beyond the Worst-Case Analysis of Algorithms, 2020.
[24] Qingqing Huang, et al. Learning Mixtures of Gaussians in High Dimensions, 2015, STOC.
[25] Santosh S. Vempala, et al. Smoothed Analysis of Discrete Tensor Decomposition and Assemblies of Neurons, 2018, NeurIPS.
[26] Anima Anandkumar, et al. Analyzing Tensor Power Method Dynamics in Overcomplete Regime, 2014, J. Mach. Learn. Res.
[27] Aditya Bhaskara, et al. Smoothed analysis of tensor decompositions, 2013, STOC.
[28] V. N. Bogaevski, et al. Matrix Perturbation Theory, 1991.
[29] Tengyu Ma, et al. On the optimization landscape of tensor decompositions, 2017, Mathematical Programming.
[30] Santosh S. Vempala, et al. Fourier PCA and robust tensor decomposition, 2013, STOC.
[31] Mikhail Belkin, et al. Polynomial Learning of Distribution Families, 2010, FOCS.
[32] Richard A. Harshman. Foundations of the PARAFAC procedure: Models and conditions for an "explanatory" multi-modal factor analysis, 1970.