On Binary Distributed Hypothesis Testing

We consider the problem of distributed binary hypothesis testing for two sequences generated by an i.i.d. doubly-binary symmetric source, where each sequence is observed by a different terminal. The two hypotheses correspond to different levels of correlation between the two source components, i.e., to different crossover probabilities between the sequences. The terminals communicate with a decision function via rate-limited noiseless links. We analyze the tradeoff between the communication rates and the exponential decay of the two error probabilities associated with the test. We first consider the side-information setting, in which one encoder is allowed to send its full sequence. For this setting, previous work exploits the fact that a decoding error of the source sequence does not necessarily lead to an erroneous decision on the hypothesis. We provide improved achievability results via a tighter analysis of the effect of binning errors; the results are also more complete, covering the full exponent tradeoff and all possible correlations. We then turn to the symmetric-rate setting, for which we use Körner-Marton coding to generalize the results, with little degradation relative to the performance under a one-sided rate constraint (the side-information setting).
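To make the setup concrete, here is a minimal formalization under standard notation; the symbols p_0, p_1, E_0, E_1 and the exact exponent definitions are our assumptions for illustration, not taken verbatim from the paper.

```latex
% Doubly-binary symmetric source (DSBS): X is Bernoulli(1/2) and
% Y = X \oplus Z, where Z ~ Bernoulli(p) is independent of X.
% The two hypotheses fix the crossover probability p:
\begin{align*}
  H_0 &: p = p_0, & H_1 &: p = p_1,\\
  \alpha_n &= \Pr\{\hat{H} = H_1 \mid H_0\}, &
  \beta_n  &= \Pr\{\hat{H} = H_0 \mid H_1\},\\
  E_0 &= \liminf_{n\to\infty} -\tfrac{1}{n}\log\alpha_n, &
  E_1 &= \liminf_{n\to\infty} -\tfrac{1}{n}\log\beta_n,
\end{align*}
% and the object of study is the set of achievable tuples
% (R_X, R_Y, E_0, E_1), where R_X, R_Y are the rates of the two links.
```

The symmetric-rate scheme builds on Körner-Marton coding: both terminals apply the same linear (syndrome) map, so that the decision function can recover the noise sequence Z = X xor Y without recovering X or Y themselves. The sketch below illustrates that idea only; the block length, rates, crossover probabilities, decision threshold, and brute-force decoder are illustrative stand-ins, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 15, 13          # tiny block length; rate k/n > h(p1) so decoding can work
p0, p1 = 0.05, 0.25    # hypothetical crossover probabilities under H0 / H1

# Both terminals share the same random binary parity-check matrix.
H = rng.integers(0, 2, size=(k, n))

def syndrome(v):
    """Each terminal transmits only its syndrome H v (mod 2): k bits, not n."""
    return (H @ v) % 2

def decode_noise(s):
    """Brute-force minimum-weight decoding over the coset {z : H z = s}.
    A stand-in for a good linear-code decoder at these toy parameters."""
    best = None
    for m in range(2 ** n):
        z = np.array([(m >> i) & 1 for i in range(n)])
        if np.array_equal(syndrome(z), s) and (best is None or z.sum() < best.sum()):
            best = z
    return best

def decide(p_true):
    """One trial: draw the DSBS, combine syndromes, decode Z, threshold-test."""
    x = rng.integers(0, 2, size=n)
    z = (rng.random(n) < p_true).astype(int)
    y = (x + z) % 2                          # DSBS: Y = X xor Z
    s = (syndrome(x) + syndrome(y)) % 2      # linearity: H x + H y = H (x xor y)
    z_hat = decode_noise(s)
    return "H0" if z_hat.mean() < (p0 + p1) / 2 else "H1"

print(decide(p0), decide(p1))   # typically 'H0 H1'; a single noisy trial can err
```

The key design point the sketch demonstrates: the syndromes add up to the syndrome of the modulo-two sum, so each link carries only about H(Z) bits per symbol rather than the rate needed to convey a source sequence itself.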
