Risk-based generalizations of f-divergences
We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
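The abstract mentions estimating the Kullback-Leibler divergence from samples. As context, a minimal sketch of a standard k-nearest-neighbour KL estimator (in the style of Wang, Kulkarni and Verdú) is shown below; this is a generic sample-based estimator for illustration, not the (f,l)-divergence-based estimator derived in the paper, and the function name `knn_kl_divergence` is an assumption of this sketch.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Estimate D_KL(P || Q) from samples x ~ P and y ~ Q using
    k-nearest-neighbour distances. Generic illustrative estimator,
    not the paper's method."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # Pairwise Euclidean distances within x and from x to y.
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    # Distance to the k-th nearest neighbour of x_i within x,
    # skipping the zero self-distance in column 0 after sorting.
    rho = np.sort(dxx, axis=1)[:, k]
    # Distance to the k-th nearest neighbour of x_i within y.
    nu = np.sort(dxy, axis=1)[:, k - 1]
    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

For continuous distributions the self-distance trick above is safe (ties have probability zero); with duplicated data points, `rho` could be zero and the estimator would diverge.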
Collections: ANU Research Publications