High-dimensional support union recovery in multivariate regression
[1] Michael I. Jordan, et al. Multiple kernel learning, conic duality, and the SMO algorithm, 2004, ICML.
[2] Charles A. Micchelli, et al. Learning the Kernel Function via Regularization, 2005, J. Mach. Learn. Res.
[3] Michael Elad, et al. Stable recovery of sparse overcomplete representations in the presence of noise, 2006, IEEE Transactions on Information Theory.
[4] Martin J. Wainwright, et al. Sharp thresholds for high-dimensional and noisy recovery of sparsity, 2006, arXiv.
[5] Joel A. Tropp, et al. Just relax: convex programming methods for identifying sparse signals in noise, 2006, IEEE Transactions on Information Theory.
[6] M. Yuan, et al. Model selection and estimation in regression with grouped variables, 2006.
[7] Peng Zhao, et al. On Model Selection Consistency of Lasso, 2006, J. Mach. Learn. Res.
[8] P. Zhao, et al. Grouped and Hierarchical Model Selection through Composite Absolute Penalties, 2007.
[9] Larry A. Wasserman, et al. SpAM: Sparse Additive Models, 2007, NIPS.
[10] A. Rinaldo, et al. On the asymptotic properties of the group lasso estimator for linear models, 2008.
[11] Francis R. Bach, et al. Consistency of the group Lasso and multiple kernel learning, 2007, J. Mach. Learn. Res.
[12] Han Liu, et al. On the ℓ1-ℓq Regularized Regression, 2008.
[13] P. Bühlmann, et al. The group lasso for logistic regression, 2008.
[14] Ben Taskar, et al. Joint covariate selection and joint subspace selection for multiple classification problems, 2010, Stat. Comput.
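The topic of the paper and these references is support union recovery via an ℓ1/ℓ2 (group-lasso-type) penalty in multivariate regression. As a minimal illustration, and not the authors' own implementation, the sketch below uses scikit-learn's `MultiTaskLasso`, which applies a row-wise ℓ1/ℓ2 penalty to the coefficient matrix so that an entire predictor is kept or dropped jointly across all responses; the synthetic data, sample sizes, and `alpha` value are assumptions chosen for the example.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, K = 200, 50, 3   # samples, predictors, response dimensions (example values)
s = 5                  # predictors that are truly active in at least one response

# Coefficient matrix B (p x K) whose nonzero rows form the shared support union
B = np.zeros((p, K))
B[:s, :] = rng.normal(size=(s, K))

X = rng.normal(size=(n, p))
Y = X @ B + 0.1 * rng.normal(size=(n, K))

# The l1/l2 block penalty zeroes out whole rows of the coefficient estimate,
# so the recovered support is the union of supports over all K responses
model = MultiTaskLasso(alpha=0.1).fit(X, Y)

# model.coef_ has shape (K, p); a predictor is selected if any response uses it
support = np.where(np.any(model.coef_ != 0, axis=0))[0]
print(support)
```

In the well-conditioned regime studied in these references (sample size large relative to `s log p`), the printed support should coincide with the planted rows `0..4`; with a smaller `n` or `alpha`, false inclusions or exclusions become likely, which is precisely the threshold behavior the paper characterizes.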