On generalized Csiszár-Kullback inequalities

Classical Csiszár-Kullback inequalities bound the L1-distance of two probability densities in terms of their relative (convex) entropies. Here we generalize such inequalities to L1 functions that are not necessarily normalized and may be non-positive. We also analyse the optimality of the derived Csiszár-Kullback type inequalities and show that in many important cases they are significantly sharper than the classical ones, in terms of the functional dependence of the L1 bound on the relative entropy. Moreover, our construction of these bounds is rather elementary.
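For orientation, a standard instance of the classical inequalities referred to above is the Csiszár-Kullback-Pinsker inequality, stated here in a common form (the symbols u, v are illustrative and not taken from this abstract):

```latex
% Csiszár-Kullback-Pinsker inequality: for probability densities u, v,
% the squared L^1 distance is controlled by the relative entropy.
\[
  \| u - v \|_{L^1}^2 \;\le\; 2 \int u \,\log\frac{u}{v}\, dx .
\]
```

The generalized inequalities announced in this abstract replace the right-hand side by relative entropies built from more general convex functions and relax the normalization and positivity assumptions on u and v.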