Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent
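The title describes ridge regression made fast by first compressing the data with a randomized principal component analysis (in the spirit of Halko et al. [2], [6]) and then solving the reduced ridge problem with gradient descent. A minimal sketch of that pipeline is below; the function names, step-size choice, and hyperparameters (`k`, `n_oversamples`, `lam`) are illustrative assumptions, not the paper's own notation or algorithm details.

```python
# Illustrative sketch (assumed pipeline, not the paper's exact algorithm):
# 1) randomized range finder + small SVD for an approximate rank-k PCA,
# 2) ridge regression on the projected features, solved by gradient descent.
import numpy as np

def randomized_pca(X, k, n_oversamples=10, seed=0):
    """Approximate top-k right singular vectors of X (Halko et al. style)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.standard_normal((d, k + n_oversamples))  # random test matrix
    Q, _ = np.linalg.qr(X @ Omega)        # orthonormal basis for range(X @ Omega)
    B = Q.T @ X                           # small (k + p) x d matrix
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt[:k].T                       # d x k projection matrix

def ridge_gd(Z, y, lam=1.0, lr=None, n_iter=500):
    """Gradient descent on 0.5*||Z w - y||^2 + 0.5*lam*||w||^2."""
    n, k = Z.shape
    w = np.zeros(k)
    if lr is None:
        # conservative step size: inverse Lipschitz constant of the gradient
        L = np.linalg.norm(Z, 2) ** 2 + lam
        lr = 1.0 / L
    for _ in range(n_iter):
        grad = Z.T @ (Z @ w - y) + lam * w
        w -= lr * grad
    return w

# Toy usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 50))
y = X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(200)
V = randomized_pca(X, k=10)               # 50 x 10 projection
w = ridge_gd(X @ V, y, lam=0.5)           # ridge weights in the reduced space
```

Projecting onto `k` components shrinks the per-iteration gradient cost from O(nd) to O(nk), which is the source of the speedup the title refers to; the stochastic-gradient variants cited below ([8], [9], [11]) would replace the full-gradient loop here.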
[1] Alexander Gammerman, et al. Ridge Regression Learning Algorithm in Dual Variables, 1998, ICML.
[2] Nathan Halko, et al. Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions, 2009, SIAM Rev.
[3] Dean P. Foster, et al. Faster Ridge Regression via the Subsampled Randomized Hadamard Transform, 2013, NIPS.
[4] Isabelle Guyon, et al. Design of Experiments for the NIPS 2003 Variable Selection Benchmark, 2003.
[5] Bing Li, et al. On Principal Components and Regression: A Statistical Explanation of a Natural Phenomenon, 2009.
[6] Nathan Halko, et al. An Algorithm for the Principal Component Analysis of Large Data Sets, 2010, SIAM J. Sci. Comput.
[7] Steve R. Gunn, et al. Result Analysis of the NIPS 2003 Feature Selection Challenge, 2004, NIPS.
[8] Tong Zhang, et al. Solving Large Scale Linear Prediction Problems Using Stochastic Gradient Descent Algorithms, 2004, ICML.
[9] Tong Zhang, et al. Accelerating Stochastic Gradient Descent Using Predictive Variance Reduction, 2013, NIPS.
[10] Martin Guha, et al. Encyclopedia of Statistics in Behavioral Science, 2006.
[11] Léon Bottou, et al. Large-Scale Machine Learning with Stochastic Gradient Descent, 2010, COMPSTAT.
[12] Sham M. Kakade, et al. A Risk Comparison of Ordinary Least Squares vs Ridge Regression, 2011, J. Mach. Learn. Res.