Robust Structured Estimation with Single-Index Models

In this paper, we investigate general single-index models (SIMs) in high dimensions. Based on U-statistics, we propose two types of robust estimators for recovering the model parameters, which can be viewed as generalizations of several existing algorithms for one-bit compressed sensing (1-bit CS). Under minimal assumptions on the noise, we establish statistical guarantees for the generalized estimators under suitable conditions that allow general structures of the underlying parameter. Moreover, we instantiate the proposed estimators for SIMs with a monotone transfer function, and the resulting estimators can better exploit the monotonicity. Experimental results are provided to support our theoretical analyses.
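For reference, a standard formulation of the single-index model (the notation below is assumed for illustration, not quoted from the paper) relates a response y to covariates x in R^p through an unknown transfer function f and a structured parameter theta*:

\[
  y = f(\langle x, \theta^* \rangle, \varepsilon), \qquad \text{e.g.}\quad y = f(\langle x, \theta^* \rangle) + \varepsilon,\ \ \mathbb{E}[\varepsilon \mid x] = 0 .
\]

Taking f(t) = sign(t) with binary responses y in {-1, +1} recovers the one-bit compressed sensing setting, which is how 1-bit CS algorithms arise as special cases of SIM estimation.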
