On the asymptotic rate of non-ergodic information sources
Before proceeding to the formulation of the problem treated in this paper, we recall some simple facts on which the concepts given in the sequel are based. Imagine a situation in which a receiver is expecting one of the messages z belonging to a finite set Z of messages that may arrive under the given circumstances. The uncertainty of this situation is evidently greater, the larger the number of a priori possible messages; let us denote this number, i.e. the number of messages in Z, by |Z|. The quantity of information needed to remove this uncertainty is numerically expressed by the number log |Z|, where the logarithm is taken to base 2; in other words, log |Z| represents the quantity of information, expressed in bits, contained in any message z in Z reaching the receiver. Consequently, the number log |Z|, as a measure of uncertainty, may be called the logarithmic uncertainty of the set Z.
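The logarithmic uncertainty described above can be sketched numerically; the following is a minimal illustration (the message set and its size are hypothetical, not taken from the paper):

```python
import math

# Hypothetical set Z of equally likely messages; |Z| = 8.
Z = {f"message_{i}" for i in range(8)}

# Logarithmic uncertainty of Z: log2 |Z|, in bits.
uncertainty_bits = math.log2(len(Z))

print(uncertainty_bits)  # 3.0 bits: any single message from Z carries 3 bits
```

Doubling the number of possible messages adds exactly one bit of uncertainty, which is the sense in which log |Z| measures information on a logarithmic scale.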