Tight Bounds for Communication-Assisted Agreement Distillation