On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection
[1] Yuval Kochman,et al. On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection , 2019, IEEE Transactions on Information Theory.
[2] Gregory W. Wornell,et al. Communication Under Strong Asynchronism , 2007, IEEE Transactions on Information Theory.
[3] Neri Merhav,et al. Optimum Tradeoffs Between the Error Exponent and the Excess-Rate Exponent of Variable-Rate Slepian–Wolf Coding , 2015, IEEE Transactions on Information Theory.
[4] S. Amari,et al. Error bound of hypothesis testing with data compression , 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[5] W. Rudin. Principles of Mathematical Analysis , 1964 .
[6] Michèle Wigger,et al. On Hypothesis Testing Against Conditional Independence With Multiple Decision Centers , 2018, IEEE Transactions on Communications.
[7] Katalin Marton,et al. Error exponent for source coding with a fidelity criterion , 1974, IEEE Trans. Inf. Theory.
[8] Yu Xiang,et al. Interactive hypothesis testing against independence , 2013, 2013 IEEE International Symposium on Information Theory.
[9] Richard E. Blahut,et al. Hypothesis testing and information theory , 1974, IEEE Trans. Inf. Theory.
[10] Mérouane Debbah,et al. On the necessity of binning for the distributed hypothesis testing problem , 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[11] Te Sun Han. Hypothesis testing with multiterminal data compression , 1987, IEEE Trans. Inf. Theory.
[12] Achilleas Anastasopoulos,et al. Error Exponent for Multiple Access Channels: Upper Bounds , 2015, IEEE Transactions on Information Theory.
[13] Imre Csiszár,et al. Graph decomposition: A new key to coding theorems , 1981, IEEE Trans. Inf. Theory.
[14] Stephen P. Boyd,et al. Convex Optimization , 2004, Algorithms and Theory of Computation Handbook.
[15] P. Gács,et al. Bounds on conditional probabilities with applications in multi-user communication , 1976 .
[16] Suguru Arimoto. Computation of random coding exponent functions , 1976, IEEE Trans. Inf. Theory.
[17] Neri Merhav,et al. Achievable Error Exponents for the Private Fingerprinting Game , 2007, IEEE Transactions on Information Theory.
[18] W. Hoeffding. Asymptotically Optimal Tests for Multinomial Distributions , 1965 .
[19] Thomas M. Cover,et al. Network Information Theory , 2001 .
[20] Mérouane Debbah,et al. Distributed Binary Detection With Lossy Data Compression , 2016, IEEE Transactions on Information Theory.
[21] Eli Haim,et al. On Binary Distributed Hypothesis Testing , 2017, ArXiv.
[22] Yury Polyanskiy,et al. Hypothesis testing via a comparator , 2012, 2012 IEEE International Symposium on Information Theory Proceedings.
[23] Shun-ichi Amari,et al. Statistical Inference Under Multiterminal Data Compression , 1998, IEEE Trans. Inf. Theory.
[24] Ertem Tuncel,et al. On error exponents in hypothesis testing , 2005, IEEE Transactions on Information Theory.
[25] Thomas M. Cover,et al. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing) , 2006 .
[26] Rudolf Ahlswede,et al. Source coding with side information and a converse for degraded broadcast channels , 1975, IEEE Trans. Inf. Theory.
[27] Michel Loève,et al. Probability Theory I , 1977 .
[28] Chao Tian,et al. Successive Refinement for Hypothesis Testing and Lossless One-Helper Problem , 2008, IEEE Transactions on Information Theory.
[29] Eli Haim,et al. Binary distributed hypothesis testing via Körner-Marton coding , 2016, 2016 IEEE Information Theory Workshop (ITW).
[30] A. Rényi. On Measures of Entropy and Information , 1961 .
[31] Aaron B. Wagner,et al. Optimality of binning for distributed hypothesis testing , 2010, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[32] Jack K. Wolf,et al. Noiseless coding of correlated information sources , 1973, IEEE Trans. Inf. Theory.
[33] Mérouane Debbah,et al. Collaborative Distributed Hypothesis Testing , 2016, ArXiv.
[34] H. Vincent Poor,et al. An Introduction to Signal Detection and Estimation , 1994, Springer Texts in Electrical Engineering.
[35] H. Vincent Poor,et al. An Introduction to Signal Detection and Estimation (2nd ed.) , 1994 .
[36] Patrick P. Bergmans,et al. Random coding theorem for broadcast channels with degraded components , 1973, IEEE Trans. Inf. Theory.
[37] Neri Merhav,et al. A Large Deviations Approach to Secure Lossy Compression , 2017, IEEE Trans. Inf. Theory.
[38] Neri Merhav,et al. Relations Between Random Coding Exponents and the Statistical Physics of Random Codes , 2007, IEEE Transactions on Information Theory.
[39] János Körner,et al. How to encode the modulo-two sum of binary sources (Corresp.) , 1979, IEEE Trans. Inf. Theory.
[40] Imre Csiszár,et al. Information Theory - Coding Theorems for Discrete Memoryless Systems, Second Edition , 2011 .
[41] Neri Merhav,et al. Statistical Physics and Information Theory , 2010, Found. Trends Commun. Inf. Theory.
[42] Aaron D. Wyner,et al. The rate-distortion function for source coding with side information at the decoder , 1976, IEEE Trans. Inf. Theory.
[43] Suguru Arimoto,et al. An algorithm for computing the capacity of arbitrary discrete memoryless channels , 1972, IEEE Trans. Inf. Theory.
[44] Nadav Shulman,et al. Communication over an unknown channel via common broadcasting , 2003 .
[45] Naftali Tishby,et al. The information bottleneck method , 2000, ArXiv.
[46] Rudolf Ahlswede,et al. Good codes can be produced by a few permutations , 1982, IEEE Trans. Inf. Theory.
[47] Michele A. Wigger,et al. On Hypothesis Testing Against Independence with Multiple Decision Centers , 2017, ArXiv.
[48] Neri Merhav,et al. Codeword or Noise? Exact Random Coding Exponents for Joint Detection and Decoding , 2014, IEEE Transactions on Information Theory.
[49] Pablo Piantanida,et al. A new approach to distributed hypothesis testing , 2016, 2016 50th Asilomar Conference on Signals, Systems and Computers.
[50] Rudolf Ahlswede,et al. Hypothesis testing with communication constraints , 1986, IEEE Trans. Inf. Theory.
[51] Neri Merhav,et al. Channel Detection in Coded Communication , 2015, IEEE Transactions on Information Theory.
[52] Hossam M. H. Shalaby,et al. Multiterminal detection with zero-rate data compression , 1992, IEEE Trans. Inf. Theory.
[53] M. Sion. On general minimax theorems , 1958 .
[54] Ligong Wang,et al. Hypothesis Testing In Multi-Hop Networks , 2017, ArXiv.
[55] Jun Chen,et al. On the Reliability Function of Variable-Rate Slepian-Wolf Coding , 2017, Entropy.
[56] Peter Harremoës,et al. Rényi Divergence and Kullback-Leibler Divergence , 2012, IEEE Transactions on Information Theory.
[57] Te Sun Han,et al. Exponential-type error probabilities for multiterminal hypothesis testing , 1989, IEEE Trans. Inf. Theory.
[58] Rudolf Ahlswede,et al. Coloring hypergraphs: A new approach to multi-user source coding, 1 , 1979 .
[59] Richard E. Blahut,et al. Computation of channel capacity and rate-distortion functions , 1972, IEEE Trans. Inf. Theory.
[60] Shun Watanabe,et al. Neyman-Pearson test for zero-rate multiterminal hypothesis testing , 2016, 2017 IEEE International Symposium on Information Theory (ISIT).
[61] Deniz Gündüz,et al. Distributed hypothesis testing over noisy channels , 2017, 2017 IEEE International Symposium on Information Theory (ISIT).
[62] Imre Csiszár,et al. Information Theory and Statistics: A Tutorial , 2004, Found. Trends Commun. Inf. Theory.
[64] Lifeng Lai,et al. Distributed testing with zero-rate compression , 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[65] Sae-Young Chung,et al. Error exponents in asynchronous communication , 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[66] Pablo Piantanida,et al. On secure distributed hypothesis testing , 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[67] Imre Csiszár,et al. The Method of Types , 1998, IEEE Trans. Inf. Theory.
[68] Imre Csiszár,et al. Towards a general theory of source networks , 1980, IEEE Trans. Inf. Theory.