
The Loss Rank Principle for Model Selection

Hutter, Marcus

Description

A key issue in statistics and machine learning is to automatically select the “right” model complexity, e.g. the number of neighbors to be averaged over in k nearest neighbor (kNN) regression or the polynomial degree in regression with polynomials. We suggest a novel principle (LoRP) for model selection in regression and classification. It is based on the loss rank, which counts how many other (fictitious) data would be fitted better. LoRP selects the model that has minimal loss rank. Unlike most penalized maximum likelihood variants (AIC, BIC, MDL), LoRP only depends on the regression functions and the loss function. It works without a stochastic noise model, and is directly applicable to any non-parametric regressor, like kNN.
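The abstract describes LoRP only at a high level; below is a minimal Python sketch of how the principle can be applied to choosing k in kNN regression. It assumes that for a linear smoother y_hat = M y with quadratic loss, the loss rank reduces (up to constants) to (n/2) ln(y^T S y) - (1/2) ln det S with S = (I - M)^T (I - M); the ridge term eps and all names in the code are illustrative, not taken from the paper.

# Sketch of LoRP for selecting k in kNN regression, under the assumption that
# the loss rank of a linear smoother y_hat = M y with quadratic loss is
#   LR(k) = (n/2) * ln(y^T S y) - (1/2) * ln det S,   S = (I - M)^T (I - M).
# The ridge term `eps` is only a numerical safeguard against singular S
# (e.g. k = 1, where the smoother interpolates the data exactly).
import numpy as np

def knn_smoother_matrix(x, k):
    """Row i averages the k nearest neighbours of x[i] (including itself)."""
    n = len(x)
    M = np.zeros((n, n))
    for i in range(n):
        nearest = np.argsort(np.abs(x - x[i]))[:k]
        M[i, nearest] = 1.0 / k
    return M

def loss_rank(x, y, k, eps=1e-8):
    """Loss rank of the kNN smoother on data (x, y), smaller is better."""
    n = len(y)
    M = knn_smoother_matrix(x, k)
    residual = y - M @ y
    loss = residual @ residual
    S = (np.eye(n) - M).T @ (np.eye(n) - M) + eps * np.eye(n)
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * n * np.log(loss + eps) - 0.5 * logdet

# Example: pick the k with minimal loss rank on noisy samples of a smooth curve.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 50))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)
best_k = min(range(2, 20), key=lambda k: loss_rank(x, y, k))
print("LoRP-selected k:", best_k)

Note that, consistent with the abstract, no noise model enters the selection: only the smoother matrix M (the regression functions) and the quadratic loss are used.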

dc.contributor.author: Hutter, Marcus
dc.date.accessioned: 2015-08-28T01:21:46Z
dc.date.available: 2015-08-28T01:21:46Z
dc.identifier.isbn: 978-3-540-72925-9
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/1885/15008
dc.description.abstract: A key issue in statistics and machine learning is to automatically select the “right” model complexity, e.g. the number of neighbors to be averaged over in k nearest neighbor (kNN) regression or the polynomial degree in regression with polynomials. We suggest a novel principle (LoRP) for model selection in regression and classification. It is based on the loss rank, which counts how many other (fictitious) data would be fitted better. LoRP selects the model that has minimal loss rank. Unlike most penalized maximum likelihood variants (AIC, BIC, MDL), LoRP only depends on the regression functions and the loss function. It works without a stochastic noise model, and is directly applicable to any non-parametric regressor, like kNN.
dc.publisher: Springer Verlag
dc.relation.ispartof: Learning Theory : 20th Annual Conference on Learning Theory, COLT 2007, San Diego, CA, USA, June 13-15, 2007, Proceedings
dc.rights: © Springer-Verlag Berlin Heidelberg 2007. http://www.sherpa.ac.uk/romeo/issn/0302-9743/... "Author's post-print on any open access repository after 12 months after publication" from SHERPA/RoMEO site (as at 28/08/15)
dc.subject: intelligence definitions
dc.subject: psychologist
dc.subject: artificial
dc.title: The Loss Rank Principle for Model Selection
dc.type: Conference paper
local.identifier.citationvolume: 4539
dc.date.issued: 2007
local.publisher.url: http://link.springer.com/
local.type.status: Accepted Version
local.contributor.affiliation: Hutter, M., Research School of Computer Science, The Australian National University
local.bibliographicCitation.startpage: 589
local.bibliographicCitation.lastpage: 603
local.identifier.doi: 10.1007/978-3-540-72927-3_42
Collections: ANU Research Publications

Download

File: Legg and Hutter A Collection of Definitions of Intelligence 2007.pdf
Size: 113.39 kB
Format: Adobe PDF


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
