Compressive sensing over networks

In this paper, we demonstrate some applications of compressive sensing over networks. We make a connection between compressive sensing and traditional information-theoretic techniques in source coding and channel coding. Our results provide an explicit trade-off between the rate and the decoding complexity. The key difference between compressive sensing and traditional information-theoretic approaches lies at the decoder: while optimal decoders for recovering a signal compressed by source coding have high complexity, the compressive sensing decoder solves a linear or convex optimization problem. First, we investigate applications of compressive sensing to the distributed compression of correlated sources. Using compressive sensing, we propose a compression scheme for a family of correlated sources with a modularized decoder, providing a trade-off between the compression rate and the decoding complexity; we call this scheme Sparse Distributed Compression. We then apply this compression scheme to a general multicast network with correlated sources: the decoder first recovers some of the sources by a network decoding technique and then uses a compressive sensing decoder to obtain all of the sources. Next, we investigate applications of compressive sensing to channel coding. We propose a coding scheme that combines compressive sensing and random channel coding for a high-SNR point-to-point Gaussian channel; we call this scheme Sparse Channel Coding. We propose a modularized decoder providing a trade-off between the capacity loss and the decoding complexity. At the receiver side, we first apply a compressive sensing decoder to the noisy signal to obtain a noisy estimate of the original signal, and then apply a traditional channel coding decoder to recover the original signal.
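The low-complexity decoder referred to above is the standard compressive sensing recovery step: solving the convex program min ‖x‖₁ subject to Ax = y (basis pursuit), which can be written as a linear program. The following is a minimal illustrative sketch of that step, not the paper's actual decoder; the signal dimensions, sparsity level, and Gaussian sensing matrix are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 50, 25, 3  # assumed: signal length, number of measurements, sparsity

# Synthetic k-sparse signal
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Random Gaussian sensing matrix and noiseless measurements
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit as a linear program: split x = u - v with u, v >= 0,
# minimize sum(u + v) (the l1 norm) subject to A(u - v) = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
```

With enough random measurements relative to the sparsity (here 25 measurements for a 3-sparse signal of length 50), the linear program recovers the signal exactly, which is what makes the decoding step cheap compared with exhaustive source-coding decoders.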
