The Augustin Capacity and Center

For any channel, the existence of a unique Augustin mean is established for every positive order and every probability mass function on the input set. The Augustin mean is shown to be the unique fixed point of an operator defined in terms of the order and the input distribution. The Augustin information is shown to be continuously differentiable in the order. For any channel and convex constraint set with finite Augustin capacity, the existence of a unique Augustin center and the associated van Erven-Harremoës bound are established. The Augustin-Legendre (A-L) information, capacity, center, and radius are introduced, and the latter three are proved to be equal to the corresponding Rényi-Gallager quantities. The equality of the A-L capacity to the A-L radius for arbitrary channels and the existence of a unique A-L center for channels with finite A-L capacity are established. For all interior points of the feasible set of cost constraints, the cost-constrained Augustin capacity and center are expressed in terms of the A-L capacity and center. Certain shift-invariant families of probabilities and certain Gaussian channels are analyzed as examples.
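The fixed-point characterization mentioned above suggests a simple iterative computation on a discrete memoryless channel. The sketch below is one plausible reading, not the paper's construction: it assumes the operator maps an output distribution q to the p-average of the order-α tilted output distributions W_x^{(α,q)}(y) ∝ W_x(y)^α q(y)^(1-α), and that iterating this map converges for orders α in (0, 1). The function name `augustin_mean` and the iteration count are illustrative choices.

```python
import numpy as np

def augustin_mean(W, p, alpha, iters=500):
    """Fixed-point iteration for an order-alpha Augustin mean (a sketch).

    W     : |X| x |Y| row-stochastic channel matrix (each row W_x is a pmf).
    p     : input pmf over |X|.
    alpha : order in (0, 1), where this plain iteration is assumed to converge.
    """
    q = p @ W  # start from the ordinary output distribution
    for _ in range(iters):
        # unnormalized tilted rows W_x(y)^alpha * q(y)^(1-alpha)
        tilt = (W ** alpha) * (q ** (1.0 - alpha))
        # normalize each row to obtain the tilted distribution W_x^{(alpha,q)}
        tilt /= tilt.sum(axis=1, keepdims=True)
        # apply the operator: p-average of the tilted output distributions
        q = p @ tilt
    return q
```

For a binary symmetric channel with uniform input, the iterate settles to a pmf that the operator leaves unchanged, which is the fixed-point property the abstract attributes to the Augustin mean.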
