Reduced-Complexity Optimization of Distributed Quantization Using the Information Bottleneck Principle

This paper addresses the optimization of distributed compression in a sensor network. Direct communication among the sensors is not possible, so noisy measurements of a single relevant signal have to be compressed locally in order to meet the rate constraints of the communication links to a common receiver. This scenario is widely known as the Chief Executive Officer (CEO) problem and represents a long-standing problem in information theory. In recent years, significant progress has been made, and the rate region has been completely characterized for specific distributions of the involved processes and specific distortion measures. While algorithmic solutions of the CEO problem are known in principle, their practical implementation quickly becomes challenging for complexity reasons. In this contribution, an efficient greedy algorithm for determining feasible solutions of the CEO problem is derived using the information bottleneck (IB) approach. Following the Wyner-Ziv coding principle, the quantizers are designed successively, using the already optimized quantizer mappings as side-information. However, processing this side-information in the optimization algorithm becomes a major bottleneck: since the side-information comprises the joint outputs of all previously designed quantizers, its alphabet size, and hence the memory complexity, grows exponentially with the number of sensors. Therefore, a sequential compression scheme is introduced that leads to a compact representation of the side-information and ensures moderate memory requirements even for larger networks. This internal compression is itself optimized by means of the IB method. Numerical results demonstrate that the overall loss in terms of relevant mutual information can be kept small even under significant compression of the side-information. The performance is compared to separately optimized quantizers and to centralized quantization. Moreover, the influence of the optimization order in asymmetric scenarios is discussed.
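The core building block of such a design, a quantizer optimized to maximize the relevant mutual information I(X;Z) for a fixed number of output levels, can be sketched in code. The following is a minimal Python sketch assuming a known, discretized joint distribution p(x, y); the deterministic KL-means-style iteration, the function names, and the toy Gaussian setup are illustrative assumptions and not the authors' actual greedy CEO algorithm, which additionally conditions the design on side-information from previously optimized sensors.

```python
# Minimal sketch of an IB quantizer design: find a deterministic mapping
# y -> z that (locally) maximizes the relevant mutual information I(X;Z).
# The iteration and toy setup below are illustrative assumptions.
import numpy as np

def mutual_information(p_joint):
    """I(A;B) in bits for a joint pmf p_joint[a, b]."""
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (pa * pb)[mask])).sum())

def ib_quantizer(p_xy, levels, iters=100, seed=0):
    """Deterministic IB: assign each y to the cluster z whose relevance
    profile p(x|z) is closest to p(x|y) in KL divergence."""
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_y = p_xy.sum(axis=0)
    p_x_given_y = p_xy / np.maximum(p_y, 1e-300)   # columns hold p(x|y)
    assign = rng.integers(levels, size=n_y)        # random initial mapping
    for _ in range(iters):
        # Update step: cluster masses p(z) and relevance profiles p(x|z).
        p_xz = np.stack([p_xy[:, assign == z].sum(axis=1)
                         for z in range(levels)], axis=1)
        p_z = p_xz.sum(axis=0)
        p_x_given_z = p_xz / np.maximum(p_z, 1e-300)
        # Assignment step: minimizing KL(p(x|y) || p(x|z)) over z equals
        # maximizing sum_x p(x|y) * log p(x|z).
        scores = p_x_given_y.T @ np.log2(p_x_given_z + 1e-300)
        new_assign = scores.argmax(axis=1)
        if np.array_equal(new_assign, assign):
            break
        assign = new_assign
    p_xz = np.stack([p_xy[:, assign == z].sum(axis=1)
                     for z in range(levels)], axis=1)
    return assign, mutual_information(p_xz)

# Toy scenario: a discretized Gaussian signal X observed in Gaussian noise.
x = np.linspace(-4, 4, 64)                         # relevant-signal grid
y = np.linspace(-5, 5, 256)                        # observation grid
p_x = np.exp(-x**2 / 2); p_x /= p_x.sum()
p_y_given_x = np.exp(-(y[None, :] - x[:, None])**2 / (2 * 0.5**2))
p_y_given_x /= p_y_given_x.sum(axis=1, keepdims=True)
p_xy = p_x[:, None] * p_y_given_x                  # joint pmf p(x, y)

mapping, relevant_info = ib_quantizer(p_xy, levels=8)
print(f"I(X;Z) with 8 quantizer levels: {relevant_info:.3f} bit")
print(f"I(X;Y) upper bound:             {mutual_information(p_xy):.3f} bit")
```

In the paper's distributed setting, the relevance profiles would additionally be conditioned on the (compressed) outputs of the previously designed quantizers, which is precisely where the exponentially growing side-information alphabet, and hence the need for its sequential compression, arises.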
