Simulation of a Channel With Another Channel

In this paper, we study the problem of simulating a discrete memoryless channel (DMC) from another DMC under both an average-case and an exact model. We present several achievability and infeasibility results, with tight characterizations in special cases. In particular, for the exact model we fully characterize when a binary symmetric channel can be simulated from a binary erasure channel in the absence of shared randomness. We also provide infeasibility and achievability results for the simulation of a binary channel from another binary channel without shared randomness. To this end, we use properties of the Rényi capacity of a given order. We also introduce a notion of “channel diameter,” which is shown to be additive and to satisfy a data processing inequality.
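
To make the BSC-from-BEC setting concrete, the sketch below shows one simple, well-known construction (not the paper's general characterization): the output terminal passes the input through a BEC(ε) and resolves each erasure with a fair local coin, which yields an end-to-end BSC with crossover probability ε/2 using no shared randomness. The function and variable names are illustrative only.

```python
import random


def bec(x, eps):
    """Binary erasure channel: erase the input bit with probability eps."""
    return None if random.random() < eps else x  # None denotes the erasure symbol


def simulate_bsc_from_bec(x, eps):
    """Simulate a BSC by post-processing a BEC(eps) output with local randomness.

    On an erasure, output a fair coin flip; otherwise pass the bit through.
    The resulting end-to-end channel is a BSC with crossover probability eps/2,
    and no randomness is shared between input and output terminals.
    """
    y = bec(x, eps)
    if y is None:
        return random.randint(0, 1)  # local coin replaces the erased bit
    return y


if __name__ == "__main__":
    eps, trials = 0.4, 100_000
    flips = sum(simulate_bsc_from_bec(0, eps) != 0 for _ in range(trials))
    print(f"empirical crossover ≈ {flips / trials:.3f} (expected {eps / 2:.3f})")
```

Running the script shows the empirical crossover probability concentrating near ε/2, illustrating why only BSCs with crossover at most ε/2 can arise from this naive construction; the paper's exact-model characterization addresses which BSCs are simulable in general.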
