Composite binary losses

dc.contributor.author: Reid, Mark
dc.contributor.author: Williamson, Robert
dc.date.accessioned: 2015-12-10T23:05:53Z
dc.date.issued: 2010
dc.date.updated: 2016-02-24T08:32:02Z
dc.description.abstract: We study losses for binary classification and class probability estimation and extend the understanding of them from margin losses to general composite losses which are the composition of a proper loss with a link function. We characterise when margin losses can be proper composite losses, explicitly show how to determine a symmetric loss in full from half of one of its partial losses, introduce an intrinsic parametrisation of composite binary losses and give a complete characterisation of the relationship between proper losses and "classification calibrated" losses. We also consider the question of the "best" surrogate binary loss. We introduce a precise notion of "best" and show there exist situations where two convex surrogate losses are incommensurable. We provide a complete explicit characterisation of the convexity of composite binary losses in terms of the link function and the weight function associated with the proper loss which make up the composite loss. This characterisation suggests new ways of "surrogate tuning" as well as providing an explicit characterisation of when Bregman divergences on the unit interval are convex in their second argument. Finally, in an appendix we present some new algorithm-independent results on the relationship between properness, convexity and robustness to misclassification noise for binary losses and show that all convex proper losses are non-robust to misclassification noise.
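The abstract's central object, a composite loss formed by composing a proper loss with a link function, can be illustrated with a standard example that is not drawn from the paper itself: the logistic margin loss log(1 + exp(-yv)) arises as the proper log loss composed with the inverse of the logit link. A minimal sketch, assuming this standard construction (all function names here are illustrative):

```python
import math

def log_loss(y, p):
    """Proper log loss on a class probability estimate p in (0, 1)."""
    return -math.log(p) if y == 1 else -math.log(1.0 - p)

def sigmoid(v):
    """Inverse of the logit link psi(p) = log(p / (1 - p))."""
    return 1.0 / (1.0 + math.exp(-v))

def composite_loss(y, v):
    """Composite loss: the proper log loss applied to sigmoid(v)."""
    return log_loss(y, sigmoid(v))

def logistic_margin_loss(y, v):
    """Margin form of the same loss: log(1 + exp(-y * v))."""
    return math.log1p(math.exp(-y * v))

# The composite and margin forms agree pointwise.
for y in (1, -1):
    for v in (-2.0, 0.0, 3.5):
        assert abs(composite_loss(y, v) - logistic_margin_loss(y, v)) < 1e-12
```

The link maps a probability estimate in (0, 1) to a real-valued prediction; composing its inverse with a proper loss is what the abstract refers to as a proper composite loss.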
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/1885/62547
dc.publisher: MIT Press
dc.source: Journal of Machine Learning Research
dc.source.uri: http://jmlr.csail.mit.edu/papers/v11/reid10a.html
dc.subject: Keywords: Bregman divergences; Classification; Classification-calibrated; Convexity; Estimation; Fisher consistency; Function evaluation; Misclassification noise; Probability estimation; Proper scoring rules; Regret bound; Robustness; Surrogate loss
dc.title: Composite binary losses
dc.type: Journal article
local.bibliographicCitation.lastpage: 2422
local.bibliographicCitation.startpage: 2387
local.contributor.affiliation: Reid, Mark, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Williamson, Robert, College of Engineering and Computer Science, ANU
local.contributor.authoruid: Reid, Mark, u4466898
local.contributor.authoruid: Williamson, Robert, u9000163
local.description.embargo: 2037-12-31
local.description.notes: Imported from ARIES
local.identifier.absfor: 080109 - Pattern Recognition and Data Mining
local.identifier.absseo: 970108 - Expanding Knowledge in the Information and Computing Sciences
local.identifier.ariespublication: f2965xPUB710
local.identifier.citationvolume: 11
local.identifier.scopusID: 2-s2.0-78649418936
local.identifier.thomsonID: 000282523400001
local.type.status: Published Version

Downloads

Original bundle

Name: 01_Reid_Composite_binary_losses_2010.pdf
Size: 689.58 KB
Format: Adobe Portable Document Format