Regression analysis with multicollinear predictor variables: definition, detection, and effects

For over fifty years researchers have encountered difficulties with least squares estimators when predictor variables in a regression analysis are multicollinear. Extensive research efforts over the last ten to fifteen years have resulted in a clear understanding of many aspects of this problem and have generated a great deal of controversy over possible solutions. In this survey the nature and effects of predictor-variable multicollinearities are examined. Emphasis is placed on discussions of the multicollinearity problem itself rather than on classical or Bayesian solutions to the problem.
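As a concrete illustration of the detection side of the problem (a minimal sketch, not the survey's own method), the following Python snippet computes two widely used diagnostics on simulated data: variance inflation factors and the condition number of the standardized predictor matrix. The simulated predictors, cutoff values, and function names are illustrative assumptions.

```python
# Sketch of two common multicollinearity diagnostics: variance inflation
# factors (VIFs) and the condition number of the standardized predictors.
import numpy as np

rng = np.random.default_rng(0)

# Simulate three predictors where x3 is nearly a linear combination of x1 and x2.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.7 * x1 + 0.3 * x2 + rng.normal(scale=0.01, size=n)  # near-collinear column
X = np.column_stack([x1, x2, x3])

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns (with an intercept)."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out[j] = 1.0 / (1.0 - r2)
    return out

# Condition number of the column-standardized predictor matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
cond = np.linalg.cond(Xs)

print("VIFs:", np.round(vif(X), 1))         # VIFs well above 10 flag collinearity
print("condition number:", round(cond, 1))  # large values signal an ill-conditioned X
```

Large VIFs and a large condition number both reflect the same underlying issue: near-linear dependencies among the predictor columns inflate the variances of the least squares coefficient estimates.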
