Model selection with the Loss Rank Principle

Authors

Hutter, Marcus
Tran, Minh-Ngoc

Publisher

Elsevier

Abstract

A key issue in statistics and machine learning is to automatically select the "right" model complexity, e.g., the number of neighbors to be averaged over in k-nearest-neighbor (kNN) regression or the polynomial degree in regression with polynomials. We suggest a novel principle, the Loss Rank Principle (LoRP), for model selection in regression and classification. It is based on the loss rank, which counts how many other (fictitious) data would be fitted better. LoRP selects the model with minimal loss rank. Unlike most penalized maximum likelihood variants (AIC, BIC, MDL), LoRP depends only on the regression functions and the loss function. It works without a stochastic noise model, and is directly applicable to any non-parametric regressor, like kNN.
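
To make the principle concrete, below is a minimal Python sketch (not taken from the paper) for linear smoothers such as kNN regression, where the fitted values are y_hat = M y for a model-dependent matrix M. Under squared loss, the fictitious outputs y' fitted at least as well as the observed y form an ellipsoid whose log-volume equals n/2 * log(loss) - log|det(I - M)| up to a constant shared by all models, so comparing this quantity across models and picking the minimizer follows the principle. The function names and toy data are illustrative, and the regularization the paper applies when I - M is near-singular is omitted.

import numpy as np

def knn_matrix(x, k):
    """Row-stochastic kNN smoothing matrix M with y_hat = M @ y.
    M[i, j] = 1/k if x_j is among the k nearest neighbours of x_i
    (the point itself included), else 0."""
    n = len(x)
    dist = np.abs(x[:, None] - x[None, :])   # pairwise distances, 1-D inputs
    M = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[:k]       # k nearest points (self included)
        M[i, nbrs] = 1.0 / k
    return M

def log_loss_rank(y, M):
    """Log loss rank of a linear smoother under squared loss: the log-volume
    of the ellipsoid of fictitious data fitted at least as well as y,
    dropping the model-independent constant."""
    n = len(y)
    residual = y - M @ y
    loss = residual @ residual               # empirical squared loss
    _, logabsdet = np.linalg.slogdet(np.eye(n) - M)
    return 0.5 * n * np.log(loss) - logabsdet

# Toy usage: pick the k with minimal loss rank, trading off fit (small loss)
# against flexibility (small |det(I - M)|; e.g. k = 1 gives M = I and an
# infinite loss rank, so it is never selected).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
ranks = {k: log_loss_rank(y, knn_matrix(x, k)) for k in range(2, 20)}
print("LoRP-selected k:", min(ranks, key=ranks.get))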

Source

Computational Statistics and Data Analysis
