
The AIC criterion and symmetrizing the Kullback-Leibler divergence

Seghouane, Abd-Krim; Amari, Shun-ichi


The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a function, used for ranking candidate models, that is a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, its lack of symmetry can become inconvenient in model selection applications. Simple examples can show that...
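For context, and as an illustrative sketch not taken from the record itself: the asymmetry at issue is that the Kullback-Leibler divergence between a true density f and a candidate density g generally depends on the order of its arguments, while a standard symmetrized variant (the Jeffreys divergence) sums the two directions:

\[
\mathrm{KL}(f \,\|\, g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx, \qquad \mathrm{KL}(f \,\|\, g) \neq \mathrm{KL}(g \,\|\, f) \text{ in general},
\]
\[
J(f, g) = \mathrm{KL}(f \,\|\, g) + \mathrm{KL}(g \,\|\, f).
\]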

Collections: ANU Research Publications
Date published: 2007
Type: Journal article
Source: IEEE Transactions on Neural Networks
DOI: 10.1109/TNN.2006.882813


File: 01_Seghouane_The_AIC_criterion_and_2007.pdf
Size: 643.52 kB
Format: Adobe PDF
Access: Request a copy

Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
