Function Computation over Networks: Efficient Information Processing for Cache and Sensor Applications

This thesis looks at efficient information processing for two network applications: content delivery with caching and the collection of summary statistics in wireless sensor networks. Both applications are studied under the same paradigm: function computation over networks, in which distributed source nodes cooperatively communicate functions of their individual observations to one or more destinations. One approach that always works is to convey all observations and let the destinations compute the desired functions themselves. When the available communication resources are limited, however, revealing less unwanted information becomes critical. Centered on this goal, this thesis develops new coding schemes using information-theoretic tools.

The first part of the thesis focuses on content delivery with caching. Caching is a technique that facilitates the reallocation of communication resources in order to avoid network congestion during peak-traffic times. An information-theoretic model, termed sequential coding for computing, is proposed to analyze the potential gains offered by caching. For the single-user case, the proposed framework verifies the optimality of some simple caching strategies and provides guidance towards optimal ones. For the two-user case, five representative subproblems are considered, drawing connections with classic source coding problems including the Gray–Wyner system, successive refinement, and the Kaspi/Heegard–Berger problem. The first part closes with the problem of distributed computing with successive refinement: it is shown that if full data recovery is required in the second stage, then any information acquired in the first stage remains useful in the second.

The second part of the thesis turns to the collection of summary statistics in wireless sensor networks.
Summary statistics include the arithmetic mean, median, standard deviation, etc., and belong to the class of symmetric functions. This thesis develops arithmetic computation coding in order to efficiently perform in-network computation of weighted arithmetic sums and symmetric functions. The developed arithmetic computation coding increases the achievable computation rate from Θ((log L)/L) to Θ(1/log L), where L is the number of sensors. Finally, the thesis demonstrates that interaction among sensors is beneficial for computing type-threshold functions, e.g., the maximum and the indicator function, and that a non-vanishing computation rate is achievable.
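The structural fact behind such rate gains is that a symmetric function depends on the sensor readings only through their empirical histogram (the type), not on which sensor produced which value. The sketch below illustrates this sufficiency in Python; the function names and sample readings are hypothetical, and this is an illustration of the general principle, not the thesis's coding scheme.

```python
from collections import Counter

def empirical_type(readings):
    """Histogram (type) of the readings: a sufficient statistic for
    every symmetric function of the data."""
    return Counter(readings)

def from_type(hist):
    """Recover representative summary statistics from the type alone."""
    n = sum(hist.values())
    mean = sum(v * c for v, c in hist.items()) / n
    maximum = max(hist)                    # a type-threshold function
    # Median via the sorted multiset implied by the histogram.
    sorted_vals = sorted(hist.elements())
    median = sorted_vals[n // 2]
    return mean, median, maximum

readings = [3, 1, 4, 1, 5, 9, 2, 6]        # hypothetical sensor values
mean, median, maximum = from_type(empirical_type(readings))
```

When the reading alphabet is small relative to L, the type can be described with far fewer bits than the full vector of L readings, which is why conveying (a coded version of) the type rather than the raw observations can improve the scaling of the computation rate.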
