Random-coding bounds for threshold decoders: Error exponent and saddlepoint approximation

This paper considers random-coding bounds on the decoding error probability of threshold decoders. A slightly improved version of the dependence-testing bound is derived. Loosening this bound yields a family of Feinstein-like bounds that improve on Feinstein's original bound. The error exponents of these bounds are determined, and simple yet accurate saddlepoint approximations to the corresponding error probabilities are derived.
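
As background for the terms above, the two bounds in question have standard one-shot forms in the finite-blocklength literature; they are recalled here for orientation only, not as the paper's own refinements. With information density $i(x;y) = \log \frac{dP_{Y|X=x}}{dP_Y}(y)$, the dependence-testing bound of Polyanskiy, Poor and Verdú guarantees a code with $M$ codewords and average error probability

$$\epsilon \le \mathbb{E}\!\left[e^{-\left[i(X;Y) - \log\frac{M-1}{2}\right]^{+}}\right],$$

while Feinstein's lemma guarantees, for every $\gamma > 0$, a code with maximal error probability

$$\epsilon \le \Pr\!\left[i(X;Y) \le \log M + \gamma\right] + e^{-\gamma}.$$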

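Saddlepoint approximations of the kind mentioned in the abstract replace a hard-to-compute tail probability with a closed-form expression built from the cumulant generating function of the relevant statistic. A minimal sketch of the underlying idea, under simplifying assumptions (a plain Bernoulli sum instead of the paper's information-density statistic; all function names are ours), is the first-order saddlepoint approximation to a lower tail, checked against the exact binomial CDF:

```python
# Illustrative sketch: first-order saddlepoint approximation to the lower
# tail P(S_n <= n*a) of a sum S_n of n i.i.d. Bernoulli(p) variables,
# compared against the exact binomial CDF. Not the paper's derivation;
# a toy stand-in for the same approximation technique.
import math

def saddlepoint_lower_tail(n: int, p: float, a: float) -> float:
    """Approximate P(S_n <= n*a) for a < p (so the saddlepoint s_hat < 0)."""
    # CGF of Bernoulli(p): kappa(s) = log(1 - p + p*e^s).
    # The saddlepoint equation kappa'(s_hat) = a solves in closed form:
    s_hat = math.log(a * (1 - p) / ((1 - a) * p))
    kappa = math.log(1 - p + p * math.exp(s_hat))
    kappa2 = a * (1 - a)  # kappa''(s_hat) = variance of the tilted Bernoulli
    # Leading-order (Bahadur-Rao type) lower-tail approximation.
    return math.exp(n * (kappa - s_hat * a)) / (
        abs(s_hat) * math.sqrt(2 * math.pi * n * kappa2))

def exact_lower_tail(n: int, p: float, a: float) -> float:
    """Exact P(S_n <= n*a) via the binomial CDF."""
    k_max = math.floor(n * a)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_max + 1))

if __name__ == "__main__":
    n, p, a = 200, 0.5, 0.4  # a < p: large-deviations (lower-tail) regime
    print(f"saddlepoint approx: {saddlepoint_lower_tail(n, p, a):.3e}")
    print(f"exact binomial CDF: {exact_lower_tail(n, p, a):.3e}")
```

Even on this toy example the approximation tracks the exact tail closely at blocklengths of a few hundred, which conveys the sense in which such approximations are "simple, yet accurate".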