How many queries will resolve common randomness?

A set of m terminals, observing correlated signals, communicate interactively to generate common randomness for a given subset of them. Knowing only the communication, how many direct queries of the value of the common randomness will resolve it? A general upper bound on the number of such queries, valid for arbitrary signal alphabets, is developed via a query strategy that applies to all common randomness and associated communication. When the underlying signals are independent and identically distributed repetitions of m correlated random variables, the number of queries can be exponential in the signal length. For this case, the upper bound is tight and leads to a single-letter formula for the largest query exponent, which coincides with the secret key capacity of a corresponding multiterminal source model. In fact, the upper bound constitutes a strong converse for the optimum query exponent, and also implies a new strong converse for secret key capacity. A key tool, which estimates the size of a large probability set in terms of Rényi entropy, is separately interpreted as a lossless block coding result for general sources. Particularized to a discrete memoryless source, it yields the classic source coding result.
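For orientation, a hedged sketch of the single-letter formula referred to above (the abstract itself does not display it; what follows assumes the Csiszár–Narayan multiterminal source model in which all m terminals seek the key): the optimum query exponent E* would then coincide with the secret key capacity,

\[
E^{*} \;=\; C_{\mathrm{SK}} \;=\; H(X_1,\ldots,X_m) \;-\; R_{\mathrm{CO}},
\]

where \(R_{\mathrm{CO}}\), the minimum rate of communication for omniscience, is the value of the linear program

\[
R_{\mathrm{CO}} \;=\; \min\Big\{\,\sum_{i=1}^{m} R_i \;:\; \sum_{i\in B} R_i \,\ge\, H\big(X_B \mid X_{B^{c}}\big) \ \text{for all nonempty } B \subsetneq \{1,\ldots,m\}\Big\}.
\]

The key tool mentioned above involves the Rényi entropy of order \(\alpha\),

\[
H_{\alpha}(P) \;=\; \frac{1}{1-\alpha}\,\log \sum_{x} P(x)^{\alpha},
\]

used to bound the size of a large probability set; specialized to i.i.d. signals, such a bound is consistent with the classic fact that lossless block coding of a discrete memoryless source requires rate H(X).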
