Near-ideal model selection by ℓ1 minimization (Jan 2008)
[1] Emmanuel J. Candès, et al. Quantitative Robust Uncertainty Principles and Optimally Sparse Decompositions, 2004, Found. Comput. Math.
[2] Terence Tao, et al. The Dantzig selector: Statistical estimation when p is much larger than n, 2005, math/0506081.
[3] Y. Ritov, et al. Persistence in high-dimensional linear predictor selection and the virtue of overparametrization, 2004.
[4] D. Steinberg, et al. Technometrics, 2008.
[5] D. M. Titterington, et al. Neural Networks: A Review from a Statistical Perspective, 1994.
[6] R. Tibshirani. Regression Shrinkage and Selection via the Lasso, 1996.
[7] P. Bickel, et al. Simultaneous Analysis of Lasso and Dantzig Selector, 2008, 0801.1095.
[8] P. Massart, et al. Gaussian model selection, 2001.
[9] Balas K. Natarajan, et al. Sparse Approximate Solutions to Linear Systems, 1995, SIAM J. Comput.
[10] S. Mallat. A Wavelet Tour of Signal Processing, 1998.
[11] E. Candès, et al. Curvelets: A Surprisingly Effective Nonadaptive Representation for Objects with Edges, 2000.
[12] E. Candès, et al. New tight frames of curvelets and optimal representations of objects with piecewise C² singularities, 2004.
[13] Dean P. Foster, et al. The risk inflation criterion for multiple regression, 1994.
[14] Martin J. Wainwright, et al. Sharp thresholds for high-dimensional and noisy recovery of sparsity, 2006, arXiv.
[15] E. Greenshtein. Best subset selection, persistence in high-dimensional statistical learning and optimization under ℓ1 constraint, 2006, math/0702684.
[16] A. Tsybakov, et al. Aggregation for Gaussian regression, 2007, 0710.3654.
[17] N. Meinshausen, et al. High-dimensional graphs and variable selection with the Lasso, 2006, math/0608017.
[18] N. Meinshausen, et al. Lasso-type recovery of sparse representations for high-dimensional data, 2008, 0806.0145.
[19] Peng Zhao, et al. On Model Selection Consistency of Lasso, 2006, J. Mach. Learn. Res.
[20] P. Massart, et al. Risk bounds for model selection via penalization, 1999.
[21] A. Tsybakov, et al. Sparsity oracle inequalities for the Lasso, 2007, 0705.3308.
[22] H. Akaike. A new look at the statistical model identification, 1974.
[23] Tong Zhang. Some sharp performance bounds for least squares regression with L1 regularization, 2009, 0908.2869.
[24] G. Schwarz. Estimating the Dimension of a Model, 1978.
[25] Michael Elad, et al. Stable recovery of sparse overcomplete representations in the presence of noise, 2006, IEEE Transactions on Information Theory.
[26] Michael A. Saunders, et al. Atomic Decomposition by Basis Pursuit, 1998, SIAM J. Sci. Comput.