Information, Divergence and Risk for Binary Experiments
We unify f-divergences, Bregman divergences, surrogate regret bounds, proper scoring rules, cost curves, ROC-curves and statistical information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their representation primitives, all of which are related to cost-sensitive binary classification. As well as developing relationships between generative and discriminative views of learning, the new machinery leads to tight and more general surrogate regret bounds and generalised Pinsker inequalities relating f-divergences to variational divergence.
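As a sketch of the kind of integral representation the abstract refers to (standard notation in the classical Österreicher–Vajda form, not necessarily the paper's own symbols): for convex $f$ with $f(1) = 0$, the f-divergence of $P$ from $Q$ is

\[
  I_f(P, Q) = \int_{\mathcal{X}} f\!\left(\frac{dP}{dQ}\right) dQ,
\]

and the cost-sensitive primitive is DeGroot's statistical information at prior $\pi \in (0,1)$, the gap between the prior and posterior Bayes risks of the binary test of $P$ against $Q$:

\[
  \Delta(\pi) = \min\{\pi,\, 1-\pi\} - \int_{\mathcal{X}} \min\{\pi \, dP,\; (1-\pi) \, dQ\}.
\]

Every f-divergence is then a weighted integral of these primitives,

\[
  I_f(P, Q) = \int_0^1 \Delta(\pi)\, \gamma_f(\pi)\, d\pi,
  \qquad
  \gamma_f(\pi) = \frac{1}{\pi^3}\, f''\!\left(\frac{1-\pi}{\pi}\right),
\]

with $f''$ taken in the distributional sense. As a check, $f(t) = |t-1|$ has $f'' = 2\delta_1$, so $\gamma_f$ concentrates at $\pi = 1/2$ and the representation collapses to $I_f(P, Q) = 4\,\Delta(1/2) = V(P, Q)$, the variational divergence.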
Collections: ANU Research Publications
Source: Journal of Machine Learning Research
File: 01_Reid_Information,_Divergence_and_2011.pdf (1.47 MB, Adobe PDF)