Estimating with randomized encoding the joint empirical distribution in a correlated source