On the symmetric information rate of two-dimensional finite-state ISI channels
[1] Hans-Andrea Loeliger et al. On the information rate of binary-input channels with memory, 2001, Proc. IEEE International Conference on Communications (ICC).
[2] V. Sharma et al. Entropy and channel capacity in the regenerative setup with applications to Markov channels, 2001, Proc. IEEE International Symposium on Information Theory (ISIT).
[3] Walter Hirt. Capacity and information rates of discrete-time channels with memory, 1988.
[4] Dimitris Anastassiou et al. Some results regarding the entropy rate of random fields, 1982, IEEE Trans. Inf. Theory.
[5] P. Siegel et al. Information rates of two-dimensional finite state ISI channels, 2003, Proc. IEEE International Symposium on Information Theory (ISIT).
[6] Shlomo Shamai et al. The intersymbol interference channel: lower bounds on capacity and channel precoding loss, 1996, IEEE Trans. Inf. Theory.
[7] Paul H. Siegel et al. On the achievable information rates of finite state ISI channels, 2001, Proc. IEEE Global Telecommunications Conference (GLOBECOM).
[8] Rainer Storn et al. Differential Evolution – a simple and efficient heuristic for global optimization over continuous spaces, 1997, J. Glob. Optim.
[9] Sei-ichiro Kamata et al. An address generator for an N-dimensional pseudo-Hilbert scan in a hyper-rectangular parallelepiped region, 2000, Proc. International Conference on Image Processing (ICIP).
[10] Paul H. Siegel et al. On the symmetric information rate of two-dimensional finite state ISI channels, 2003, Proc. IEEE Information Theory Workshop (ITW).
[11] Thomas M. Cover et al. Elements of Information Theory, 2005.
[12] Hans-Otto Georgii et al. Gibbs Measures and Phase Transitions, 1988.
[13] Demetri Psaltis. Holographic memories, 1996, International Commission for Optics.
[14] James L. Massey et al. Capacity of the discrete-time Gaussian channel with intersymbol interference, 1988, IEEE Trans. Inf. Theory.
[15] John Cocke et al. Optimal decoding of linear codes for minimizing symbol error rate (Corresp.), 1974, IEEE Trans. Inf. Theory.
[16] Wei Zeng et al. Simulation-based computation of information rates for channels with memory, 2006, IEEE Trans. Inf. Theory.
[17] Shlomo Shamai et al. Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs, 1991, IEEE Trans. Inf. Theory.
[18] Zheng Zhang et al. On information rates of single-track and multi-track magnetic recording channels with intertrack interference, 2002, Proc. IEEE International Symposium on Information Theory (ISIT).
[19] Hans-Andrea Loeliger et al. Computation of information rates from finite-state source/channel models, 2002.
[20] Benjamin Weiss et al. Commuting measure-preserving transformations, 1972.
[21] Abraham Lempel et al. Compression of individual sequences via variable-rate coding, 1978, IEEE Trans. Inf. Theory.
[22] Abraham Lempel et al. Compression of two-dimensional data, 1986, IEEE Trans. Inf. Theory.
[23] John J. Birch. Approximations for the entropy for functions of Markov chains, 1962.
[24] Andries P. Hekstra et al. Signal processing and coding for two-dimensional optical storage, 2003, Proc. IEEE Global Telecommunications Conference (GLOBECOM).
[25] Kenneth Zeger et al. The capacity of some hexagonal (d,k)-constraints, 2001, Proc. IEEE International Symposium on Information Theory (ISIT).