Robust Inference Using the Exponential-Polynomial Divergence

Density-based minimum divergence procedures are popular techniques in parametric statistical inference, combining strong robustness properties with high (sometimes full) asymptotic efficiency. Among such procedures, methods based on Bregman divergences have the attractive property that the empirical formulation of the divergence requires no nonparametric smoothing technique such as kernel density estimation. Methods based on the density power divergence (DPD) represent the current standard in this area of research. In this paper, we present the exponential-polynomial divergence, a more general family that subsumes the DPD as a special case and produces several new options offering better compromises between robustness and efficiency.
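
To make the "no kernel smoothing required" point concrete, the sketch below illustrates minimum DPD estimation (the existing standard referred to above, not the new divergence proposed in this paper) for a normal model. It is a minimal illustration, not code from the paper; the function names `dpd_objective` and `fit_mdpde`, the choice of a normal model, and the optimizer settings are all assumptions made for the example.

```python
# Minimal sketch (not from the paper) of minimum density power divergence (DPD)
# estimation for a N(mu, sigma^2) model. The empirical objective uses only the
# model density evaluated at the observed data, so no kernel density estimate
# of the data-generating density is needed. The tuning parameter alpha > 0
# trades efficiency (alpha -> 0 approaches the MLE) against robustness
# (larger alpha downweights outliers).

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm


def dpd_objective(params, x, alpha):
    """Empirical DPD objective for a normal model.

    Up to a term not depending on the parameters, minimizing the DPD amounts
    to minimizing
        integral of f_theta^(1+alpha) - (1 + 1/alpha) * mean(f_theta(X_i)^alpha).
    For the normal density the integral term has the closed form
        (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parametrize on the log scale to keep sigma > 0
    f = norm.pdf(x, loc=mu, scale=sigma)
    integral_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral_term - (1 + 1 / alpha) * np.mean(f**alpha)


def fit_mdpde(x, alpha=0.5):
    """Minimum DPD estimate of (mu, sigma) for i.i.d. data x."""
    start = np.array([np.median(x), np.log(np.std(x))])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    return mu_hat, sigma_hat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.normal(0.0, 1.0, size=95)
    outliers = rng.normal(10.0, 1.0, size=5)  # 5% contamination
    data = np.concatenate([clean, outliers])
    print("Near-MLE (alpha = 0.01):", fit_mdpde(data, alpha=0.01))
    print("Robust   (alpha = 0.5): ", fit_mdpde(data, alpha=0.5))
```

With small alpha the fit behaves like maximum likelihood and is pulled toward the contaminating observations, while a moderate alpha largely ignores them; the proposed exponential-polynomial family is intended to offer additional tuning options of this kind beyond the single DPD parameter.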
