Generalized source coding theorems and hypothesis testing: Part I — Information measures

Abstract: Expressions for the ϵ‐entropy rate, ϵ‐mutual information rate and ϵ‐divergence rate are introduced. These quantities, which are defined in terms of quantiles of the asymptotic information spectra, generalize the inf/sup‐entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the ϵ‐capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes and to establish a general formula for the Neyman‐Pearson type‐II error exponent subject to upper bounds on the type‐I error probability.
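For concreteness, the following is a plausible sketch of how such quantile-based quantities are typically defined in the information-spectrum framework of Han and Verdú; the precise notation and conventions of the paper itself may differ.

```latex
% Normalized entropy density (information spectrum) of a source X = {X^n}:
%   h_n(X^n) = (1/n) \log ( 1 / P_{X^n}(X^n) ).
% The Han--Verdu sup-entropy rate is the limsup in probability of h_n;
% a natural epsilon-quantile generalization, for 0 <= epsilon < 1, is
\bar{H}_{\epsilon}(X) \;=\;
  \inf\Bigl\{\theta :\;
    \limsup_{n\to\infty}
    \Pr\Bigl[\tfrac{1}{n}\log\tfrac{1}{P_{X^n}(X^n)} > \theta\Bigr]
    \le \epsilon
  \Bigr\},
% which recovers the sup-entropy rate in the limit epsilon -> 0.
```

Analogous quantile definitions applied to the mutual-information and divergence spectra would yield the ϵ‐mutual information rate and ϵ‐divergence rate mentioned above.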