MDL convergence speed for Bernoulli sequences
We study the Minimum Description Length (MDL) principle for online sequence estimation/prediction in a proper learning setup. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is finitely bounded, implying convergence with probability one, and (b) it additionally specifies the convergence speed. For MDL, in general one can only obtain loss bounds which are finite but exponentially larger than those for...
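The setting described above can be illustrated with a minimal sketch: an online two-part-code MDL predictor over a discrete (here finite) class of Bernoulli parameters, accumulating square loss against the true parameter. The particular model class, uniform prior weights, and sample sequence are illustrative assumptions, not taken from the paper.

```python
import math

# Illustrative assumptions: a finite class of Bernoulli parameters with a
# uniform prior; the true source is taken to be Bernoulli(0.7).
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
weights = [1.0 / len(thetas)] * len(thetas)

def log_likelihood(theta, ones, zeros):
    # log P(x_1..x_n | theta) for a Bernoulli(theta) source,
    # given the counts of ones and zeros seen so far
    return ones * math.log(theta) + zeros * math.log(1 - theta)

def mdl_predict(ones, zeros):
    # Two-part code: select the theta minimising the total codelength
    # -log w(theta) - log P(x | theta), then predict with that single model
    best = max(range(len(thetas)),
               key=lambda i: math.log(weights[i])
                             + log_likelihood(thetas[i], ones, zeros))
    return thetas[best]

# Online prediction loop on a sample sequence
sequence = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
ones = zeros = 0
sq_loss = 0.0
for x in sequence:
    p = mdl_predict(ones, zeros)   # predicted probability of the next 1
    sq_loss += (p - 0.7) ** 2      # instantaneous square loss vs true parameter
    ones += x
    zeros += 1 - x
print(round(sq_loss, 3))
```

The quantity `sq_loss` is the (empirical counterpart of the) total expected square loss whose finiteness the abstract refers to; as the counts grow, the MDL estimate locks onto the parameter closest to the true source.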
Collections: ANU Research Publications
Source: Statistics and Computing
File: Poland and Hutter MDL Convergence Speed 2006.pdf (308.92 kB, Adobe PDF)