ON DIMENSION REDUCTION IN REGRESSIONS WITH MULTIVARIATE RESPONSES

This paper is concerned with dimension reduction in regressions with multivariate responses on high-dimensional predictors. A unified method, which can be regarded as either an inverse regression approach or a forward regression method, is proposed to recover the central dimension reduction subspace. Using Stein's Lemma, the forward regression estimates the first derivative of the conditional characteristic function of the response given the predictors; using the Fourier method, the inverse regression estimates the subspace spanned by the conditional mean of the predictors given the responses. Both routes lead to an identical kernel matrix, while preserving as much regression information as possible. An illustrative real-data example and comprehensive simulations are used to demonstrate the application of our methods.
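To make the Fourier/inverse-regression side of this idea concrete, below is a minimal NumPy sketch of a kernel-matrix estimator for multivariate responses. It assumes a Gaussian weight function over the frequency vector t, under which the weighted integral of exp(i t'(y_i - y_j)) has the closed form exp(-sigma_w^2 ||y_i - y_j||^2 / 2); the function name fourier_kernel_sdr, the bandwidth sigma_w, and the eigen-decomposition step are illustrative assumptions for this sketch, not the paper's exact estimator.

import numpy as np

def fourier_kernel_sdr(X, Y, d, sigma_w=1.0):
    # Sketch of a Fourier-type inverse-regression estimator.
    # Sample kernel matrix:
    #   M = (1/n^2) sum_{i,j} z_i z_j' exp(-sigma_w^2 ||y_i - y_j||^2 / 2),
    # where z_i are the standardized predictors. With a Gaussian weight
    # over the frequency t, the integral of exp(i t'(y_i - y_j)) reduces
    # to the Gaussian factor above, so M is real and symmetric.
    n, p = X.shape
    # Standardize predictors: z = Sigma^{-1/2}(x - mean); assumes Sigma > 0.
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Pairwise squared distances between responses (Y is n x q).
    sq_dists = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    W = np.exp(-0.5 * sigma_w ** 2 * sq_dists)

    # p x p symmetric sample kernel matrix.
    M = Z.T @ W @ Z / n ** 2

    # Top-d eigenvectors span the estimated subspace in the z-scale;
    # back-transform to the original x-scale.
    w, V = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ V[:, np.argsort(w)[::-1][:d]]
    return B

As a quick sanity check, generating a bivariate response such as Y1 = sin(b'X) + noise and Y2 = (b'X)^2 + noise and calling fourier_kernel_sdr(X, np.column_stack([Y1, Y2]), d=1) should return a direction close to b up to sign and scale.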
