Multicollinearity is a serious problem in regression analysis: in its presence, Ordinary Least Squares (OLS) estimates of the regression coefficients can be highly variable. The Least Absolute Shrinkage and Selection Operator (LASSO) is a well-established method that reduces this variability by shrinking the coefficients and, at the same time, produces interpretable models by setting some coefficients to exactly zero. We examine the performance of LASSO-type estimators in the presence of multicollinearity using a Monte Carlo approach. LASSO, Adaptive LASSO, Elastic Net, Fused LASSO, and Ridge Regression (RR) are compared on simulated data sets exhibiting multicollinearity, using the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). A Monte Carlo experiment of 1000 trials was carried out at sample sizes n = 50, 100, and 150 with different levels of correlation among the exogenous variables (ρ = 0.3, 0.6, and 0.9). Overall, LASSO performs best, but Elastic Net tends to be more accurate when the sample size is large.
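As a rough illustration of one replication of such an experiment, the sketch below (Python with NumPy and scikit-learn, not the authors' code) generates equicorrelated predictors at the stated values of n and ρ, fits LASSO, Ridge, and Elastic Net, and scores each fit with a Gaussian-likelihood AIC/BIC. The true coefficient vector, the penalty strengths, and the AIC/BIC formulation are illustrative assumptions; Adaptive LASSO and Fused LASSO are omitted because scikit-learn has no built-in implementations of them.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge, ElasticNet

# Hypothetical sketch of a single Monte Carlo replication per (n, rho) setting.
# Predictors share a common pairwise correlation rho (equicorrelated design).

def simulate_design(n, p, rho, rng):
    # Covariance matrix with 1 on the diagonal and rho off the diagonal.
    cov = np.full((p, p), rho)
    np.fill_diagonal(cov, 1.0)
    return rng.multivariate_normal(np.zeros(p), cov, size=n)

def information_criteria(y, y_hat, n_params):
    # Gaussian-likelihood AIC/BIC up to additive constants; the model size
    # is taken as the number of nonzero estimated coefficients.
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    aic = n * np.log(rss / n) + 2 * n_params
    bic = n * np.log(rss / n) + np.log(n) * n_params
    return aic, bic

rng = np.random.default_rng(0)
beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])  # illustrative true coefficients

for n in (50, 100, 150):
    for rho in (0.3, 0.6, 0.9):
        X = simulate_design(n, len(beta), rho, rng)
        y = X @ beta + rng.normal(scale=1.0, size=n)
        estimators = {
            "LASSO": Lasso(alpha=0.1),
            "Ridge": Ridge(alpha=1.0),       # Ridge keeps all p coefficients nonzero
            "Elastic Net": ElasticNet(alpha=0.1, l1_ratio=0.5),
        }
        for name, est in estimators.items():
            est.fit(X, y)
            k = np.count_nonzero(est.coef_)
            aic, bic = information_criteria(y, est.predict(X), k)
            print(f"n={n} rho={rho} {name}: AIC={aic:.1f} BIC={bic:.1f}")
```

In a full experiment, this loop would be repeated 1000 times per setting and the criteria averaged across replications before comparing the estimators.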