The Well-Known Limit of Redundancy-Free Information in Texts - A Misinterpretation of Shannon's Experiments?
It is shown that the redundancy-free value of specific information (entropy) in natural-language texts, measured in bits per letter, cannot be a constant. Rather, it should decrease monotonically with growing text length. Surprisingly, Shannon himself mentioned this fact in passing in his famous paper on printed English. The region in which the minimum-entropy function can lie is now determined numerically. A first approximation of this function leads to integrated information values that are proportional to the square root of the text length.
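As a hedged illustration of how the two claims fit together (a sketch based on the abstract's square-root law, not a result stated in the paper): if the integrated information of a text of N letters grows as

\[
H(N) = c \sqrt{N},
\]

where c is a hypothetical proportionality constant not given in the abstract, then the specific information per letter is

\[
h(N) = \frac{H(N)}{N} = \frac{c}{\sqrt{N}},
\]

which indeed decreases monotonically with growing text length N, consistent with the claim that the entropy per letter cannot be a constant.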