Distributed Channel Synthesis

Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed when no other resources are available. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This paper characterizes the optimal tradeoff between the amount of common randomness used and the required description rate. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this paper also generalizes and strengthens this soft covering tool.
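
For concreteness, the tradeoff can be summarized as follows; this is our paraphrase of the paper's main result, with the (assumed but standard) notation R for the description rate, R_0 for the common randomness rate, and U for an auxiliary random variable. The pair (R, R_0) suffices to synthesize the channel if and only if there exists a U, consistent with the target input distribution and channel, forming the Markov chain X - U - Y and satisfying

    R \ge I(X;U), \qquad R + R_0 \ge I(X,Y;U).

Setting R_0 = 0 reduces this to R \ge \min_{X - U - Y} I(X,Y;U), which is Wyner's common information C(X;Y); letting R_0 \to \infty leaves only R \ge \min_{X - U - Y} I(X;U) = I(X;Y) (take U = Y), which is Shannon's mutual information. These are the two extreme operating points mentioned above.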

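The soft covering lemma can also be illustrated numerically. The following Python sketch is a toy construction of ours under assumed parameters (Bernoulli codewords and a binary symmetric channel), not code from the paper: it draws a random codebook of 2^{nR} codewords, passes a uniformly chosen codeword through a memoryless channel, and computes the exact total variation distance between the induced output distribution and the i.i.d. target. The distance should be visibly smaller for a rate above I(U;X) than for one below it, which is the phenomenon the lemma quantifies.

import numpy as np
from itertools import product

rng = np.random.default_rng(0)

n = 10         # blocklength (kept small so the TV distance is exact)
p_u = 0.3      # codeword symbols drawn i.i.d. Bernoulli(p_u)
eps = 0.1      # BSC(eps) from U to X
p_x = p_u * (1 - eps) + (1 - p_u) * eps   # target marginal of X

def h(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def tv_induced_vs_target(rate):
    """Exact TV distance between the codebook-induced distribution of X^n
    and the i.i.d. Bernoulli(p_x)^n target, for one random codebook."""
    m = 2 ** round(n * rate)                            # codebook size 2^{nR}
    codebook = (rng.random((m, n)) < p_u).astype(int)   # i.i.d. codewords
    xs = np.array(list(product([0, 1], repeat=n)))      # all 2^n output strings
    agree = xs[:, None, :] == codebook[None, :, :]      # shape (2^n, m, n)
    cond = np.where(agree, 1 - eps, eps).prod(axis=2)   # P^n(x^n | u^n(i))
    induced = cond.mean(axis=1)                         # uniform codeword choice
    target = np.where(xs == 1, p_x, 1 - p_x).prod(axis=1)
    return 0.5 * np.abs(induced - target).sum()

print(f"I(U;X) = {h(p_x) - h(eps):.3f} bits")
for rate in (0.2, 0.8):   # one rate below I(U;X), one above
    print(f"R = {rate}: TV = {tv_induced_vs_target(rate):.3f}")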