LEAST SQUARES VERSUS MINIMUM ABSOLUTE DEVIATIONS ESTIMATION IN LINEAR MODELS

Previous research has indicated that minimum absolute deviations (MAD) estimators tend to be more efficient than ordinary least squares (OLS) estimators in the presence of large disturbances. Via Monte Carlo sampling, this study investigates cases in which disturbances are normally distributed with constant variance except for one or more outliers whose disturbances are drawn from a normal distribution with a much larger variance. It is found that MAD estimation retains its advantage over OLS across a wide range of conditions, including variations in outlier variance, number of regressors, number of observations, design matrix configuration, and number of outliers. When no outliers are present, the efficiency of MAD estimators relative to OLS varies remarkably little across these conditions.
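The sampling design described above can be sketched in code. The following is a minimal illustration, not the paper's actual experiment: the sample size, number of regressors, true coefficients, outlier count, and outlier standard deviation are all assumed for demonstration, and the MAD (least absolute deviations) fit is approximated by iteratively reweighted least squares rather than any algorithm the paper may have used.

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares coefficients."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def lad_fit(X, y, iters=50, eps=1e-4):
    """Approximate minimum absolute deviations (L1) fit via
    iteratively reweighted least squares: weights 1/|r_i|
    downweight large residuals, mimicking the L1 criterion."""
    b = ols_fit(X, y)
    for _ in range(iters):
        r = y - X @ b
        w = 1.0 / np.maximum(np.abs(r), eps)
        XtW = X.T * w                      # (p, n): column j scaled by w[j]
        b = np.linalg.solve(XtW @ X, XtW @ y)
    return b

rng = np.random.default_rng(0)
n, n_out, reps = 50, 5, 500                # observations, outliers, replications (assumed)
beta = np.array([1.0, 2.0, -1.0])          # illustrative true coefficients
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # fixed design matrix

sse_ols = sse_lad = 0.0
for _ in range(reps):
    e = rng.normal(0.0, 1.0, n)            # constant-variance disturbances
    idx = rng.choice(n, n_out, replace=False)
    e[idx] = rng.normal(0.0, 10.0, n_out)  # outliers: much larger variance (sd 10 vs. 1)
    y = X @ beta + e
    sse_ols += np.sum((ols_fit(X, y) - beta) ** 2)
    sse_lad += np.sum((lad_fit(X, y) - beta) ** 2)

rel_eff = sse_ols / sse_lad                # > 1 means MAD outperforms OLS here
print(round(rel_eff, 2))
```

Under this outlier-contaminated setup the relative efficiency should exceed one, in line with the study's finding that MAD retains its advantage over OLS when heavy-tailed contamination is present.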