New Subsampling Algorithms for Fast Least Squares Regression
Paramveer S. Dhillon | Yichao Lu | Dean P. Foster | Lyle H. Ungar
[1] Bernard Chazelle, et al. Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform, 2006, STOC '06.
[2] Michael W. Mahoney. Randomized Algorithms for Matrices and Data, 2011, Found. Trends Mach. Learn.
[3] Dean P. Foster, et al. Two Step CCA: A new spectral method for estimating vector models of words, 2012, ICML.
[4] V. Rokhlin, et al. A fast randomized algorithm for overdetermined linear least-squares regression, 2008, Proceedings of the National Academy of Sciences.
[5] Gene H. Golub, et al. Matrix Computations (3rd ed.), 1996.
[6] Dean P. Foster, et al. Multi-View Learning of Word Embeddings via CCA, 2011, NIPS.
[7] S. Muthukrishnan, et al. Faster least squares approximation, 2007, Numerische Mathematik.
[8] R. Vershynin. How Close is the Sample Covariance Matrix to the Actual Covariance Matrix?, 2010, arXiv:1004.3484.
[9] Christos Boutsidis, et al. Improved Matrix Algorithms via the Subsampled Randomized Hadamard Transform, 2012, SIAM J. Matrix Anal. Appl.
[10] Shai Shalev-Shwartz, et al. Stochastic dual coordinate ascent methods for regularized loss, 2012, J. Mach. Learn. Res.
[11] Roman Vershynin, et al. Introduction to the non-asymptotic analysis of random matrices, 2010, Compressed Sensing.
[12] Mark Tygert, et al. A fast algorithm for computing minimal-norm solutions to underdetermined systems of linear equations, 2009, arXiv.
[13] Sivan Toledo, et al. Blendenpik: Supercharging LAPACK's Least-Squares Solver, 2010, SIAM J. Sci. Comput.