Guaranteed Sparse Recovery under Linear Transformation
Ji Liu | Lei Yuan | Jieping Ye
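For orientation, the recovery problem named in the title is the lasso with sparsity imposed under a linear transformation (the "dictionary lasso" of reference [17]). The display below is a minimal sketch of that objective under standard notation; the symbols A (design matrix), y (observation vector), D (transformation matrix), and λ (regularization weight) are assumptions of this sketch, not definitions taken from this page.

% Minimal sketch of a transformed-sparsity (dictionary-lasso) objective.
% A, y, D, and \lambda are assumed notation, not defined on this page.
\[
  \hat{x} \;\in\; \arg\min_{x \in \mathbb{R}^{p}}
    \; \tfrac{1}{2}\,\lVert A x - y \rVert_{2}^{2}
    \;+\; \lambda\,\lVert D x \rVert_{1}
\]

When D is the identity this reduces to the standard lasso; when D is a differencing operator it covers the fused-lasso and total-variation settings cited in [7], [25], and [26].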
[1] Justin K. Romberg,et al. The Dantzig selector and generalized thresholding , 2008, 2008 42nd Annual Conference on Information Sciences and Systems.
[2] Martin J. Wainwright,et al. Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso) , 2009, IEEE Transactions on Information Theory.
[3] Tong Zhang,et al. On the Consistency of Feature Selection using Greedy Least Squares Regression , 2009, J. Mach. Learn. Res..
[4] Yonina C. Eldar,et al. Compressed Sensing with Coherent and Redundant Dictionaries , 2010, ArXiv.
[5] Terence Tao,et al. The Dantzig selector: Statistical estimation when p is much larger than n , 2005, math/0506081.
[6] N. Meinshausen,et al. High-dimensional graphs and variable selection with the Lasso , 2006, math/0608017.
[7] R. Tibshirani,et al. Sparsity and smoothness via the fused lasso , 2005, J. R. Stat. Soc. B.
[8] Michael Elad,et al. The Cosparse Analysis Model and Algorithms , 2011, ArXiv.
[9] A. Tsybakov,et al. Sparsity oracle inequalities for the Lasso , 2007, 0705.3308.
[10] Emmanuel J. Candès,et al. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information , 2004, IEEE Transactions on Information Theory.
[11] Martin J. Wainwright,et al. Model Selection in Gaussian Graphical Models: High-Dimensional Consistency of ℓ1-regularized MLE , 2008, NIPS.
[12] R. Tibshirani,et al. Pathwise coordinate optimization , 2007, 0708.1485.
[13] Alessandro Rinaldo,et al. Sparsistency of the Edge Lasso over Graphs , 2012, AISTATS.
[14] Mohamed-Jalal Fadili,et al. Robust Sparse Analysis Regularization , 2011, IEEE Transactions on Information Theory.
[15] Bin Yu,et al. Model Selection in Gaussian Graphical Models: High-Dimensional Consistency of ℓ1-regularized MLE , 2008, NIPS.
[16] Tong Zhang. Some sharp performance bounds for least squares regression with L1 regularization , 2009, 0908.2869.
[17] Jieping Ye,et al. Dictionary LASSO: Guaranteed Sparse Recovery under Linear Transformation , 2013 .
[18] Jieping Ye,et al. A Multi-Stage Framework for Dantzig Selector and LASSO , 2012, J. Mach. Learn. Res..
[19] Emmanuel J. Candès,et al. Decoding by linear programming , 2005, IEEE Transactions on Information Theory.
[20] Le Song,et al. Sparsistent Learning of Varying-coefficient Models with Structural Changes , 2009, NIPS.
[21] Peng Zhao,et al. On Model Selection Consistency of Lasso , 2006, J. Mach. Learn. Res..
[22] Karim Lounici. Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators , 2008, 0801.4610.
[23] E. Candès,et al. Near-ideal model selection by ℓ1 minimization , 2008, 0801.0345.
[24] Michael Elad,et al. Stable recovery of sparse overcomplete representations in the presence of noise , 2006, IEEE Transactions on Information Theory.
[25] A. Rinaldo. Properties and refinements of the fused lasso , 2008, 0805.0234.
[26] Tony F. Chan,et al. Total variation blind deconvolution , 1998, IEEE Trans. Image Process.
[27] Ming Yuan,et al. Sparse Recovery in Large Ensembles of Kernel Machines , 2008, COLT.