A scaled Bregman theorem with applications

Nock, Richard; Menon, Aditya; Ong, Cheng Soon


Bregman divergences play a central role in the design and analysis of a range of machine learning algorithms through a handful of popular theorems. We present a new theorem which shows that "Bregman distortions" (employing a potentially non-convex generator) may be exactly re-written as a scaled Bregman divergence computed over transformed data. This property can be viewed from the standpoints of geometry (a scaled isometry with adaptive metrics) or convex optimization (relating generalized...
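To make the abstract's central object concrete: a Bregman divergence for a convex generator φ is D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩. The sketch below is purely illustrative and not taken from the paper; the choice of generator φ(x) = ‖x‖², which recovers the squared Euclidean distance, is an assumption for the example.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Illustrative generator: phi(x) = ||x||^2, with gradient 2x.
# Its Bregman divergence is the squared Euclidean distance ||x - y||^2.
phi = lambda x: np.dot(x, x)
grad_phi = lambda x: 2 * x

x = np.array([1.0, 2.0])
y = np.array([0.0, 1.0])
d = bregman_divergence(phi, grad_phi, x, y)
assert np.isclose(d, np.sum((x - y) ** 2))
```

Other generators recover other familiar divergences; for example, the negative Shannon entropy yields the KL divergence, which is why Bregman divergences unify so many losses used in machine learning.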

Collections: ANU Research Publications
Date published: 2016
Type: Conference paper
Source: Advances in Neural Information Processing Systems
Access Rights: Open Access


File: 01_Nock_A_scaled_Bregman_theorem_with_2016.pdf (1.07 MB, Adobe PDF)

Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
