Sharp optimality for density deconvolution with dominating bias

We consider estimation of the common probability density $f$ of i.i.d. random variables $X_i$ that are observed with additive i.i.d. noise. We assume that the unknown density $f$ belongs to a class $\mathcal{A}$ of densities whose characteristic functions decay as $\exp(-\alpha |u|^r)$ as $|u|\to \infty$, where $\alpha >0$, $r>0$. The noise density is assumed to be known, with a characteristic function decaying as $\exp(-\beta |u|^s)$ as $|u| \to \infty$, where $\beta >0$, $s>0$. Assuming that $r<s$, we propose a kernel-type estimator that is optimal in the sharp asymptotic minimax sense on $\mathcal{A}$, simultaneously under the pointwise and the $\mathbb{L}_2$-risks. The variance of the estimator turns out to be asymptotically negligible with respect to its squared bias. For $r<s/2$ we construct a sharp adaptive estimator of $f$. We discuss some effects of the dominating bias, such as the superefficiency of minimax estimators.
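
For orientation, a standard Fourier-inversion (deconvolution kernel) construction of this type may be sketched as follows; the notation $Y_j$, $\xi_j$, $\phi_\xi$ and the cut-off $U_n$ are illustrative assumptions and not necessarily the exact estimator or calibration analyzed in the paper:
$$
\hat f_n(x) \;=\; \frac{1}{2\pi}\int_{-U_n}^{U_n} e^{-iux}\,\frac{\hat\phi_n(u)}{\phi_\xi(u)}\,du,
\qquad
\hat\phi_n(u)\;=\;\frac{1}{n}\sum_{j=1}^{n} e^{iuY_j},
$$
where $Y_j = X_j + \xi_j$ are the observations, $\phi_\xi$ is the known characteristic function of the noise, $\hat\phi_n$ is the empirical characteristic function of the data, and $U_n \to \infty$ is a spectral cut-off chosen from the decay exponents $(\alpha, r, \beta, s)$.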