Resolvability on Continuous Alphabets

We characterize the resolvability region for a large class of point-to-point channels with continuous alphabets. In the direct part, we not only prove the existence of good resolvability codebooks, but also adapt an approach based on the Chernoff-Hoeffding bound to the continuous case, showing that the probability of drawing an unsuitable codebook is doubly exponentially small. For the converse part, we show that our previous elementary result carries over to the continuous case under a mild continuity assumption.
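
As a rough illustration of the kind of direct statement referred to above (the notation, the rate threshold, and the constants $\gamma$ and $c$ below are assumed for exposition and are not taken from the paper), a Chernoff-Hoeffding-type concentration argument typically yields, for codewords $X^n(1),\dots,X^n(2^{nR})$ drawn i.i.d. from $P_X^{\otimes n}$ at a rate $R > I(X;Y)$, a bound of the form

\[
\Pr\!\left[\, d_{\mathrm{TV}}\!\Big( \tfrac{1}{2^{nR}} \sum_{m=1}^{2^{nR}} P_{Y^n \mid X^n}\big(\cdot \mid X^n(m)\big),\; P_Y^{\otimes n} \Big) > 2^{-n\gamma} \right] \;\le\; \exp\!\big(-2^{\,nc}\big)
\]

for some constants $\gamma, c > 0$, so that the probability of drawing an unsuitable codebook remains doubly exponentially small even after a union bound over exponentially many events.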
