On Distributed Compression of Linear Functions
[1] Jack K. Wolf et al., "Noiseless coding of correlated information sources," IEEE Trans. Inf. Theory, 1973.
[2] Thomas M. Cover et al., Elements of Information Theory, 2005.
[3] Meir Feder et al., "On lattice quantization noise," IEEE Trans. Inf. Theory, 1996.
[4] Suhas N. Diggavi et al., "Approximating the Gaussian Multiple Description Rate Region Under Symmetric Distortion Constraints," IEEE Trans. Inf. Theory, 2009.
[5] Ram Zamir et al., "The rate loss in the Wyner-Ziv problem," IEEE Trans. Inf. Theory, 1996.
[6] L. Ozarow et al., "On a source-coding problem with two channels and three receivers," Bell Syst. Tech. J., 1980.
[7] Yasutada Oohama et al., "Rate-distortion theory for Gaussian multiterminal source coding systems with several side informations at the decoder," IEEE Trans. Inf. Theory, 2005.
[8] Bernd Girod et al., "Distributed Video Coding," Proc. IEEE, 2005.
[9] Shlomo Shamai et al., "Nested linear/lattice codes for structured multiterminal binning," IEEE Trans. Inf. Theory, 2002.
[10] Shun-ichi Amari et al., "Statistical Inference Under Multiterminal Data Compression," IEEE Trans. Inf. Theory, 1998.
[11] Michael Gastpar et al., "The case for structured random codes in network capacity theorems," Eur. Trans. Telecommun., 2008.
[12] R. Durrett, Probability: Theory and Examples, 1993.
[13] Muriel Médard et al., "A SIMO Fiber Aided Wireless Network Architecture," Proc. IEEE Int. Symp. Information Theory (ISIT), 2006.
[14] Thomas M. Cover et al., "A Proof of the Data Compression Theorem of Slepian and Wolf for Ergodic Sources," 1971.
[15] A. Wagner et al., "The lossy one-helper conjecture is false," Proc. 47th Annu. Allerton Conf. Communication, Control, and Computing, 2009.
[16] Suhas N. Diggavi et al., "Approximate capacity of Gaussian relay networks," Proc. IEEE Int. Symp. Information Theory (ISIT), 2008.
[17] Sriram Vishwanath et al., "Communicating the Difference of Correlated Gaussian Sources over a MAC," Proc. Data Compression Conference (DCC), 2009.
[18] P. Viswanath et al., "The Gaussian Many-Help-One Distributed Source Coding Problem," Proc. IEEE Information Theory Workshop (ITW '06), Chengdu, 2006.
[19] Yasutada Oohama, "Gaussian multiterminal source coding," IEEE Trans. Inf. Theory, 1997.
[20] Yasutada Oohama et al., "Distributed Source Coding of Correlated Gaussian Remote Sources," IEEE Trans. Inf. Theory, 2008.
[21] Michael Gastpar et al., "Cooperative strategies and capacity theorems for relay networks," IEEE Trans. Inf. Theory, 2005.
[22] P. M. Ebert et al., "The capacity of the Gaussian channel with feedback," Bell Syst. Tech. J., 1970.
[23] Venkat Anantharam et al., "An improved outer bound for multiterminal source coding," IEEE Trans. Inf. Theory, 2008.
[24] Toby Berger et al., "The CEO problem [multiterminal source coding]," IEEE Trans. Inf. Theory, 1996.
[25] Michael Gastpar et al., "The Wyner-Ziv problem with multiple sources," IEEE Trans. Inf. Theory, 2004.
[26] Shlomo Shamai et al., "Communication Via Decentralized Processing," IEEE Trans. Inf. Theory, 2005.
[27] Vittorio Castelli et al., "Near sufficiency of random coding for two descriptions," IEEE Trans. Inf. Theory, 2006.
[28] Simon Litsyn et al., "Lattices which are good for (almost) everything," IEEE Trans. Inf. Theory, 2005.
[29] János Körner et al., "How to encode the modulo-two sum of binary sources (Corresp.)," IEEE Trans. Inf. Theory, 1979.
[30] Imre Csiszár et al., Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., 2011.
[31] A. Lapidoth, "On the role of mismatch in rate distortion theory," Proc. IEEE Int. Symp. Information Theory (ISIT), 1995.
[32] Vinod M. Prabhakaran et al., "Rate region of the quadratic Gaussian CEO problem," Proc. IEEE Int. Symp. Information Theory (ISIT), 2004.
[33] Hua Wang et al., "Gaussian Interference Channel Capacity to Within One Bit," IEEE Trans. Inf. Theory, 2007.
[34] Aaron B. Wagner et al., "An outer bound for distributed compression of linear functions," Proc. 42nd Annu. Conf. Information Sciences and Systems (CISS), 2008.
[35] S. Sandeep Pradhan et al., "Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function," IEEE Trans. Inf. Theory, 2007.
[36] Joseph A. O'Sullivan et al., "Achievable Rates for Pattern Recognition," IEEE Trans. Inf. Theory, 2005.
[37] Meir Feder et al., "Information rates of pre/post-filtered dithered quantizers," IEEE Trans. Inf. Theory, 1993.
[38] Pramod Viswanath et al., "Rate Region of the Quadratic Gaussian Two-Encoder Source-Coding Problem," IEEE Trans. Inf. Theory, 2005.
[39] Uri Erez et al., "Achieving 1/2 log(1+SNR) on the AWGN channel with lattice encoding and decoding," IEEE Trans. Inf. Theory, 2004.
[40] Thomas M. Cover et al., "Gaussian feedback capacity," IEEE Trans. Inf. Theory, 1989.
[41] Abbas El Gamal et al., "Capacity theorems for the relay channel," IEEE Trans. Inf. Theory, 1979.
[42] Hua Wang et al., "Vector Gaussian Multiple Description With Individual and Central Receivers," IEEE Trans. Inf. Theory, 2005.