On robustness and efficiency of minimum divergence estimators

We study the trade-off between efficiency and robustness for estimators obtained by minimizing divergence statistics, and for their adjoints, which minimize the asymmetric counterparts of these divergences. In particular, we show that no minimum power-divergence estimator is better than the minimum Hellinger distance estimator in terms of both second-order efficiency and robustness.
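For reference, a common parameterization of the power-divergence family is the Cressie-Read form (the exact parameterization intended here is an assumption):
\[
  I_\lambda(p, q) \;=\; \frac{1}{\lambda(\lambda+1)} \sum_i p_i \left[ \left( \frac{p_i}{q_i} \right)^{\lambda} - 1 \right], \qquad \lambda \neq 0,\, -1,
\]
with the cases \(\lambda = 0\) and \(\lambda = -1\) defined by continuity (giving the likelihood-divergence and Kullback-Leibler-type members). Under this parameterization, the choice \(\lambda = -1/2\) yields a constant multiple of the squared Hellinger distance, so the minimum Hellinger distance estimator is itself a member of the power-divergence family.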