
Asymptotics of discrete MDL for online prediction

Poland, Jan; Hutter, Marcus

Description

Minimum Description Length (MDL) is an important principle for induction and prediction, with strong relations to optimal Bayesian learning. This paper deals with learning non-i.i.d. processes by means of two-part MDL, where the underlying model class is countable. We consider the online learning framework, i.e. observations come in one by one, and the predictor is allowed to update his state of mind after each time step. We identify two ways of predicting by MDL for this setup, namely a static and a dynamic one. (A third variant, hybrid MDL, will turn out inferior.) We will prove that under the only assumption that the data is generated by a distribution contained in the model class, the MDL predictions converge to the true values almost surely. This is accomplished by proving finite bounds on the quadratic, the Hellinger, and the Kullback-Leibler loss of the MDL learner, which are however exponentially worse than for Bayesian prediction. We demonstrate that these bounds are sharp, even for model classes containing only Bernoulli distributions. We show how these bounds imply regret bounds for arbitrary loss functions. Our results apply to a wide range of setups, namely sequence prediction, pattern classification, regression, and universal induction in the sense of Algorithmic Information Theory, among others.
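As a reading aid, the following is a minimal Python sketch of the static and dynamic two-part MDL predictors the abstract refers to, over a finite class of Bernoulli models. The finite truncation, the weights, and all names are illustrative assumptions for exposition, not the paper's construction (which handles arbitrary countable classes of non-i.i.d. measures).

    # Minimal sketch of two-part MDL prediction over a countable (here: finite)
    # model class, assuming binary sequences and Bernoulli models.  All choices
    # below (grid, weights, names) are illustrative, not the paper's.

    class Bernoulli:
        """Bernoulli(theta) as a measure on binary strings."""
        def __init__(self, theta):
            self.theta = theta

        def prob(self, xs):
            """Probability of the bit sequence xs under this model."""
            p = 1.0
            for x in xs:
                p *= self.theta if x == 1 else 1.0 - self.theta
            return p

    # Countable class truncated to a finite grid; weights w_nu with sum <= 1
    # play the role of 2^(-CodeLength(nu)) in the two-part code.
    models = [Bernoulli(k / 10) for k in range(1, 10)]
    weights = [2.0 ** -(k + 1) for k in range(len(models))]

    def mdl_estimate(xs):
        """Two-part MDL: maximize w_nu * nu(xs), i.e. minimize code length."""
        return max(range(len(models)),
                   key=lambda i: weights[i] * models[i].prob(xs))

    def static_mdl_predict(xs):
        """Static MDL: select one model on the past, predict its conditional."""
        i = mdl_estimate(xs)
        return models[i].theta          # P(next bit = 1 | xs)

    def dynamic_mdl_predict(xs):
        """Dynamic MDL: re-select the model for each candidate continuation,
        then normalize.  (The unnormalized ratio m(xs+[a]) / m(xs) would be
        the hybrid MDL variant the abstract calls inferior.)"""
        m = lambda ys: max(weights[i] * models[i].prob(ys)
                           for i in range(len(models)))
        scores = [m(list(xs) + [a]) for a in (0, 1)]
        return scores[1] / (scores[0] + scores[1])   # P(next bit = 1 | xs)

    # Online loop: observations arrive one by one; the predictor updates
    # its prediction after each time step.
    data = [1, 1, 0, 1, 1, 1, 0, 1]
    for t in range(len(data)):
        past = data[:t]
        print(t, static_mdl_predict(past), dynamic_mdl_predict(past))

Per the paper's results, both predictors' cumulative quadratic, Hellinger, and Kullback-Leibler losses are finite whenever the true distribution lies in the class, though with constants exponentially worse than for the Bayes mixture.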

dc.contributor.author: Poland, Jan
dc.contributor.author: Hutter, Marcus
dc.date.accessioned: 2015-08-31T04:35:09Z
dc.date.available: 2015-08-31T04:35:09Z
dc.identifier.issn: 0018-9448
dc.identifier.uri: http://hdl.handle.net/1885/15036
dc.description.abstract: Minimum Description Length (MDL) is an important principle for induction and prediction, with strong relations to optimal Bayesian learning. This paper deals with learning non-i.i.d. processes by means of two-part MDL, where the underlying model class is countable. We consider the online learning framework, i.e. observations come in one by one, and the predictor is allowed to update his state of mind after each time step. We identify two ways of predicting by MDL for this setup, namely a static and a dynamic one. (A third variant, hybrid MDL, will turn out inferior.) We will prove that under the only assumption that the data is generated by a distribution contained in the model class, the MDL predictions converge to the true values almost surely. This is accomplished by proving finite bounds on the quadratic, the Hellinger, and the Kullback-Leibler loss of the MDL learner, which are however exponentially worse than for Bayesian prediction. We demonstrate that these bounds are sharp, even for model classes containing only Bernoulli distributions. We show how these bounds imply regret bounds for arbitrary loss functions. Our results apply to a wide range of setups, namely sequence prediction, pattern classification, regression, and universal induction in the sense of Algorithmic Information Theory, among others.
dc.publisher: IEEE
dc.rights: http://www.sherpa.ac.uk/romeo/issn/0018-9448/... "Author's post-print on Author's server or Institutional server" from SHERPA/RoMEO site (as at 31/08/15).
dc.rights: © 2005 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.source: IEEE Transactions on Information Theory, 51:11 (2005) 3780-3795
dc.title: Asymptotics of discrete MDL for online prediction
dc.type: Journal article
local.identifier.citationvolume: 51
dc.date.issued: 2005-06-08
local.publisher.url: http://www.ieee.org/index.html
local.type.status: Accepted Version
local.contributor.affiliation: Hutter, M., Research School of Computer Science, The Australian National University
local.bibliographicCitation.issue: 11
local.bibliographicCitation.startpage: 3780
local.bibliographicCitation.lastpage: 3795
local.identifier.doi: 10.1109/TIT.2005.856956
Collections: ANU Research Publications

Download

File: Poland and Hutter Asymptotics of Discrete MDL 2005.pdf
Size: 376.32 kB
Format: Adobe PDF


