Information, Divergence and Risk for Binary Experiments

Date

2011

Authors

Reid, Mark
Williamson, Robert

Publisher

MIT Press

Abstract

We unify f-divergences, Bregman divergences, surrogate regret bounds, proper scoring rules, cost curves, ROC curves and statistical information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their representation primitives, all of which are related to cost-sensitive binary classification. As well as developing relationships between generative and discriminative views of learning, the new machinery leads to tight and more general surrogate regret bounds and generalised Pinsker inequalities relating f-divergences to variational divergence. The new viewpoint also illuminates existing algorithms: it provides a new derivation of Support Vector Machines in terms of divergences and relates maximum mean discrepancy to Fisher linear discriminants.
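The relationship between f-divergences and variational divergence mentioned in the abstract can be made concrete with a small numerical sketch. The Python snippet below is illustrative only: the function names and the specific distributions are assumptions, not the paper's notation. It instantiates KL divergence and variational divergence as f-divergences and numerically checks the classical Pinsker inequality KL(P||Q) >= V(P,Q)^2 / 2, the kind of bound the paper generalises.

    # Minimal sketch (not from the paper): f-divergences and Pinsker's inequality.
    import numpy as np

    def f_divergence(p, q, f):
        """Generic f-divergence I_f(P, Q) = sum_i q_i * f(p_i / q_i)
        for discrete distributions with q_i > 0."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return float(np.sum(q * f(p / q)))

    # KL divergence is the f-divergence generated by f(t) = t * log(t).
    kl = lambda p, q: f_divergence(p, q, lambda t: t * np.log(t))

    # Variational divergence V(P, Q) = sum_i |p_i - q_i| is the f-divergence
    # generated by f(t) = |t - 1|.
    variational = lambda p, q: f_divergence(p, q, lambda t: np.abs(t - 1))

    # Two arbitrary example distributions (illustrative choices).
    P = np.array([0.7, 0.2, 0.1])
    Q = np.array([0.4, 0.4, 0.2])

    # Classical Pinsker inequality: KL(P||Q) >= V(P, Q)^2 / 2.
    print(kl(P, Q), variational(P, Q) ** 2 / 2)   # ~0.184 >= 0.18
    assert kl(P, Q) >= variational(P, Q) ** 2 / 2

For these particular distributions the two sides are close (about 0.184 versus 0.18), which hints at why tighter, generalised Pinsker-style inequalities of the kind the paper derives are of interest.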

Keywords

Classification; Divergence; Loss functions; Regret bounds; Statistical information

Source

Journal of Machine Learning Research

Type

Journal article

Access Statement

Open Access
