A link-free method for testing the significance of predictors

An important step in regression analysis is identifying the significant predictors among a pool of candidates, so that a parsimonious model can be built from those predictors alone. However, most existing methods assume a linear relationship between the response and the predictors, which may be inappropriate in some applications. In this article, we discuss a link-free method that avoids specifying how the response depends on the predictors. As a result, the method is free of model misspecification and is well suited to screening significant predictors at the preliminary stage of data analysis. A test statistic is proposed and its asymptotic distribution is derived. Examples are used to demonstrate the proposed method.
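The abstract does not reproduce the paper's test statistic, so the sketch below is only a rough illustration of the link-free idea, not the authors' method: it screens a predictor for significance with a distance-covariance permutation test, which likewise requires no assumption about how the response depends on the predictor. All function names here are illustrative.

```python
import numpy as np

def _centered_dist(x):
    # Pairwise absolute-distance matrix, double-centered so that
    # row means, column means, and the grand mean are all zero.
    d = np.abs(x[:, None] - x[None, :])
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_cov(x, y):
    # Sample distance covariance; zero (in population) iff x and y
    # are independent, so it detects nonlinear dependence too.
    A, B = _centered_dist(x), _centered_dist(y)
    return np.sqrt(max((A * B).mean(), 0.0))

def permutation_test(x, y, n_perm=499, seed=0):
    # Link-free significance test for a single predictor: compare the
    # observed statistic to its permutation null distribution.
    rng = np.random.default_rng(seed)
    stat = distance_cov(x, y)
    null = [distance_cov(rng.permutation(x), y) for _ in range(n_perm)]
    p_value = (1 + sum(s >= stat for s in null)) / (n_perm + 1)
    return stat, p_value
```

Because the test never fits a mean function, it can flag a predictor that enters the model purely nonlinearly (e.g. y = x² + noise), a case where a correlation-based linear screen would fail.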
