The Generalized Lasso With Non-Linear Observations
[1] Gitta Kutyniok, et al. 1.2 Sparsity: A Reasonable Assumption?, 2012.
[2] Ewout van den Berg, et al. 1-Bit Matrix Completion, 2012, ArXiv.
[3] Joel A. Tropp, et al. Living on the edge: phase transitions in convex programs with random data, 2013, arXiv:1303.6672.
[4] David L. Donoho, et al. Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing, 2009, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences.
[5] Yaniv Plan, et al. Dimension Reduction by Random Hyperplane Tessellations, 2014, Discret. Comput. Geom.
[6] A. Juditsky, et al. Direct estimation of the index coefficient in a single-index model, 2001.
[7] J. Horowitz. Semiparametric and Nonparametric Methods in Econometrics, 2007.
[8] Christos Thrampoulidis, et al. The squared-error of generalized LASSO: A precise analysis, 2013, 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[9] M. Wainwright. Structured Regularizers for High-Dimensional Problems: Statistical and Computational Issues, 2014.
[10] R. Vershynin. Lectures in Geometric Functional Analysis, 2012.
[11] Christos Thrampoulidis, et al. Simple error bounds for regularized noisy linear inverse problems, 2014, IEEE International Symposium on Information Theory.
[12] P. Bickel, et al. Simultaneous Analysis of Lasso and Dantzig Selector, 2008, arXiv:0801.1095.
[14] M. Wakin. Manifold-Based Signal Recovery and Parameter Estimation from Compressive Measurements, 2010, arXiv:1002.1247.
[15] S. Mendelson, et al. Reconstruction and Subgaussian Operators in Asymptotic Geometric Analysis, 2007.
[16] Yaniv Plan, et al. One-bit compressed sensing with non-Gaussian measurements, 2012, ArXiv.
[17] Yaniv Plan, et al. Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach, 2012, IEEE Transactions on Information Theory.
[18] Joel A. Tropp, et al. Living on the edge: A geometric theory of phase transitions in convex optimization, 2013, ArXiv.
[19] Holger Rauhut, et al. A Mathematical Introduction to Compressive Sensing, 2013, Applied and Numerical Harmonic Analysis.
[20] R. Ambartzumian. Stochastic and integral geometry, 1987.
[21] Joel A. Tropp, et al. Convex recovery of a structured signal from independent random linear measurements, 2014, ArXiv.
[22] Y. Plan, et al. High-dimensional estimation with geometric constraints, 2014, arXiv:1404.3749.
[23] E. Greenshtein. Best subset selection, persistence in high-dimensional statistical learning and optimization under ℓ1 constraint, 2006, arXiv:math/0702684.
[24] Yonina C. Eldar, et al. Uniqueness conditions for low-rank matrix recovery, 2011, Optical Engineering + Applications.
[25] Christos Thrampoulidis, et al. LASSO with Non-linear Measurements is Equivalent to One With Linear Measurements, 2015, NIPS.
[26] A. Tsybakov, et al. Aggregation for Gaussian regression, 2007, arXiv:0710.3654.
[27] Martin J. Wainwright, et al. A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers, 2009, NIPS.
[28] Richard G. Baraniuk, et al. Random Projections of Smooth Manifolds, 2009, Found. Comput. Math.
[29] S. Geer, et al. On the conditions used to prove oracle results for the Lasso, 2009, arXiv:0910.0722.
[30] R. Tibshirani, et al. A Significance Test for the Lasso, 2013, Annals of Statistics.
[31] S. Mendelson, et al. Empirical processes and random projections, 2005.
[32] Yu. I. Ingster, et al. Statistical inference in compound functional models, 2012, arXiv:1208.6402.
[33] Michael B. Wakin, et al. Stable manifold embeddings with operators satisfying the Restricted Isometry Property, 2011, 45th Annual Conference on Information Sciences and Systems.
[34] M. Rudelson, et al. On sparse reconstruction from Fourier and Gaussian measurements, 2008.
[35] Raja Giryes, et al. On the Effective Measure of Dimension in the Analysis Cosparse Model, 2014, IEEE Transactions on Information Theory.
[36] Michael B. Wakin, et al. New Analysis of Manifold Embeddings and Signal Recovery from Compressive Measurements, 2013, ArXiv.
[37] Francis R. Bach, et al. Trace Lasso: a trace norm regularization for correlated designs, 2011, NIPS.
[38] S. Mendelson, et al. Learning subgaussian classes: Upper and minimax bounds, 2013, arXiv:1305.4825.
[39] G. Schechtman. Two observations regarding embedding subsets of Euclidean spaces in normed spaces, 2006.
[40] Pierre Alquier, et al. Sparse single-index model, 2011, J. Mach. Learn. Res.
[41] Shahar Mendelson, et al. Learning without Concentration, 2014, COLT.
[42] T. Blumensath, et al. Theory and Applications, 2011.
[43] Christos Thrampoulidis, et al. Simple Bounds for Noisy Linear Inverse Problems with Exact Side Information, 2013, ArXiv.
[44] Y. Gordon. On Milman's inequality and random subspaces which escape through a mesh in ℝⁿ, 1988.
[45] Emmanuel J. Candès, et al. Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements, 2011, IEEE Transactions on Information Theory.
[46] D. Brillinger. A Generalized Linear Model With "Gaussian" Regressor Variables, 2012.
[47] Mihailo Stojnic. Various thresholds for ℓ1-optimization in compressed sensing, 2009, ArXiv.
[48] Pablo A. Parrilo, et al. The Convex Geometry of Linear Inverse Problems, 2010, Foundations of Computational Mathematics.
[49] Roman Vershynin. Introduction to the non-asymptotic analysis of random matrices, 2010, Compressed Sensing.
[50] R. Tibshirani. Regression Shrinkage and Selection via the Lasso, 1996.
[51] Christophe Guyeux, et al. Using the LASSO for gene selection in bladder cancer data, 2015, arXiv:1504.05004.
[52] J. Jahn. Introduction to the Theory of Nonlinear Optimization, 1994.
[53] E. Candès, et al. Near-ideal model selection by ℓ1 minimization, 2008, arXiv:0801.0345.