Meaningful Information

The information in an individual finite object (such as a binary string) is commonly measured by its Kolmogorov complexity. One can divide that information into two parts: the part that accounts for the useful regularity present in the object, and the part that accounts for the remaining accidental information. The regularity can be expressed in several ways, that is, with respect to several model classes. Kolmogorov proposed the model class of finite sets, which was later generalized to computable probability mass functions. The resulting theory, known as Algorithmic Statistics, analyzes the algorithmic sufficient statistic when the statistic is restricted to the given model class. However, the most general way to proceed is perhaps to express the useful information as a total recursive function. The resulting measure has been called the "sophistication" of the object. We develop the theory of the total recursive function statistic: its maximum and minimum values; the existence of absolutely nonstochastic objects (objects of maximal sophistication, in which all information is meaningful and there is no residual randomness); its relation to the more restricted model classes of finite sets and computable probability distributions, in particular with respect to the algorithmic (Kolmogorov) minimal sufficient statistic; its relation to the halting problem; and further algorithmic properties.
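
To make the two-part decomposition concrete, here is a brief sketch in standard algorithmic-statistics notation (the formulas below paraphrase the usual definitions and are not quoted from the paper; $K(\cdot)$ is prefix Kolmogorov complexity and $\ell(d)$ is the length of the string $d$). For the finite-set model class, a set $S \ni x$ yields the two-part description

\[ K(x) \le K(S) + \log_2 |S| + O(1), \]

and $S$ is an algorithmic sufficient statistic for $x$ when this two-part description is as concise as the best one-part description:

\[ K(S) + \log_2 |S| = K(x) + O(1). \]

In that case the $K(S)$ bits are the meaningful (model) information and the $\log_2 |S|$ bits are the accidental information in $x$. For the total recursive function class, the sophistication of $x$ at significance level $c$ is the complexity of the simplest total program that produces $x$ near-optimally from some data:

\[ \operatorname{soph}_c(x) = \min \bigl\{ K(p) : p \text{ total recursive},\ \exists d \ \bigl[\, p(d) = x \ \text{and}\ K(p) + \ell(d) \le K(x) + c \,\bigr] \bigr\}. \]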
