Algorithmic complexity bounds on future prediction errors

Date

2007-02

Authors

Chernov, Alexey
Hutter, Marcus
Schmidhuber, Jürgen

Publisher

Elsevier

Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1 ⋯ x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ⋯ by a new variant of algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
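
For context, Solomonoff's classical total-deviation bound referred to above is usually stated, for binary sequences and up to the exact constant, as in the sketch below (standard notation, not the paper's own formulation):

  \[
    \sum_{t=1}^{\infty} \mathbf{E}_{\mu}\Bigl[ \bigl( M(x_t = 1 \mid x_{<t}) - \mu(x_t = 1 \mid x_{<t}) \bigr)^{2} \Bigr]
    \;\le\; \frac{\ln 2}{2}\, K(\mu) \;<\; \infty,
  \]

where K(μ) denotes the prefix complexity of the true computable distribution μ. The paper's posterior bound, as described in the abstract, replaces the right-hand side with a quantity that depends on the already-observed prefix x.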

Keywords

Kolmogorov complexity, posterior bounds, online sequential prediction, Solomonoff prior, monotone conditional complexity

Source

Information and Computation

Type

Journal article

Access Statement

Open Access
