Algorithmic complexity bounds on future prediction errors

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
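
For orientation, Solomonoff's classical total-error bound (stated here for a binary alphabet and prefix Kolmogorov complexity K; the notation is ours, not the paper's, and constants differ slightly between references) reads:

\[
  \sum_{t=1}^{\infty} \sum_{x_{<t}} \mu(x_{<t})\,\bigl( M(1 \mid x_{<t}) - \mu(1 \mid x_{<t}) \bigr)^2 \;\le\; \frac{\ln 2}{2}\, K(\mu).
\]

The present work replaces K(μ) by a monotone variant of the complexity of μ conditioned on the observed prefix x = x_1...x_t, plus the complexity of the randomness deficiency of x, so that the resulting bound on future errors can only tighten as the condition x is prolonged.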

[1] Jürgen Schmidhuber et al. Algorithmic Theories of Everything, 2000, ArXiv.

[2] Marcus Hutter. Sequence Prediction Based on Monotone Complexity, 2003, COLT.

[3] Marcus Hutter. New Error Bounds for Solomonoff Prediction, 2001, J. Comput. Syst. Sci.

[4] Paul M. B. Vitányi et al. Clustering by compression, 2003, IEEE Transactions on Information Theory.

[5] Jürgen Schmidhuber et al. Hierarchies of Generalized Kolmogorov Complexities and Nonenumerable Universal Measures Computable in the Limit, 2002, Int. J. Found. Comput. Sci.

[6] Marcus Hutter et al. Optimality of Universal Bayesian Sequence Prediction for General Loss and Alphabet, 2003, J. Mach. Learn. Res.

[7] Marcus Hutter et al. Monotone Conditional Complexity Bounds on Future Prediction Errors, 2005, ALT.

[8] Ray J. Solomonoff et al. A Formal Theory of Inductive Inference. Part II, 1964, Inf. Control.

[9] Marcus Hutter et al. Convergence of Discrete MDL for Sequential Prediction, 2004, COLT.

[10] Marcus Hutter et al. Universal Convergence of Semimeasures on Individual Random Sequences, 2004, ALT.

[11] L. Levin et al. The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms, 1970.

[12] Jürgen Schmidhuber. The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions, 2002.

[13] Marcus Hutter et al. Universal Artificial Intelligence: Sequential Decisions Based on Algorithmic Probability, 2005, Texts in Theoretical Computer Science. An EATCS Series.

[14] Alexander Shen et al. Relations between varieties of Kolmogorov complexities, 1996, Mathematical Systems Theory.

[15] Marcus Hutter. Convergence and Loss Bounds for Bayesian Sequence Prediction, 2003, IEEE Trans. Inf. Theory.

[16] Ming Li et al. An Introduction to Kolmogorov Complexity and Its Applications, 2019, Texts in Computer Science.

[17] Marcus Hutter. Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences, 2001, ECML.

[18] Marcus Hutter. Optimality of universal Bayesian prediction for general loss and alphabet, 2003.

[19] Marcus Hutter. General Loss Bounds for Universal Sequence Prediction, 2001, ICML.

[20] Ray J. Solomonoff et al. Complexity-based induction systems: Comparisons and convergence theorems, 1978, IEEE Trans. Inf. Theory.

[21] Marcus Hutter. Sequential Predictions based on Algorithmic Complexity, 2006, J. Comput. Syst. Sci.