A coding theorem for time-discrete analog data sources
The main result of the paper is a coding theorem for time-discrete sources subject to a fidelity criterion on average distortion. The result applies to a subclass of the ergodic sources (stationary sources having a strong mixing property), whether the source samples are continuously distributed (analog) quantities or discrete symbols. An intuitive definition of the information content of data samples for this class of sources is then formally established and is shown to coincide exactly with Shannon's definition.
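The Shannon definition referred to above is the rate-distortion function. As a sketch of the standard formulation (not reproduced from this paper): for a source letter $X$ with distribution $p(x)$, a reproduction alphabet $\hat{X}$, and a single-letter distortion measure $d(x,\hat{x})$, the rate-distortion function is

```latex
R(D) \;=\; \min_{\substack{p(\hat{x}\mid x)\,:\\ \mathbb{E}[d(X,\hat{X})]\,\le\, D}} I(X;\hat{X}),
```

the minimum mutual information over all test channels $p(\hat{x}\mid x)$ whose average distortion does not exceed $D$. The coding theorem asserts that $R(D)$ is the least rate (in bits per sample) at which the source can be encoded with average distortion at most $D$; extending this from memoryless sources to the stationary, strongly mixing sources treated here is the substance of the paper's result.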