Thresholded Lasso for high dimensional variable selection and statistical estimation