A scaled Bregman theorem with applications
Nock, Richard; Menon, Aditya; Ong, Cheng Soon
Description
Bregman divergences play a central role in the design and analysis of a range of machine learning algorithms through a handful of popular theorems. We present a new theorem which shows that "Bregman distortions" (employing a potentially non-convex generator) may be exactly re-written as a scaled Bregman divergence computed over transformed data. This property can be viewed from the standpoints of geometry (a scaled isometry with adaptive metrics) or convex optimization (relating generalized...
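For readers unfamiliar with the objects in the abstract, the standard Bregman divergence generated by a differentiable convex function f is D_f(x, y) = f(x) − f(y) − ⟨∇f(y), x − y⟩. The sketch below is background only (it is not the paper's scaled Bregman theorem) and uses the classic example f(x) = ||x||², whose divergence is the squared Euclidean distance:

```python
import numpy as np

def bregman_divergence(f, grad_f, x, y):
    """Bregman divergence D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>,
    for a differentiable convex generator f."""
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

# Classic example: f(x) = ||x||^2 gives the squared Euclidean distance.
f = lambda v: np.dot(v, v)
grad_f = lambda v: 2.0 * v

x = np.array([1.0, 2.0])
y = np.array([0.0, 0.5])
d = bregman_divergence(f, grad_f, x, y)  # equals ||x - y||^2 = 1.0 + 2.25 = 3.25
```

Other generators recover other familiar divergences; e.g. the negative Shannon entropy yields the (generalized) KL divergence. The paper's contribution concerns rewriting distortions of this form, with possibly non-convex generators, as scaled Bregman divergences over transformed data.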
| Field | Value |
|---|---|
| Collections | ANU Research Publications |
| Date published | 2016 |
| Type | Conference paper |
| URI | http://hdl.handle.net/1885/154047 |
| Source | Advances in Neural Information Processing Systems |
| Access Rights | Open Access |
Download
| File | Description | Size | Format |
|---|---|---|---|
| 01_Nock_A_scaled_Bregman_theorem_with_2016.pdf | | 1.07 MB | Adobe PDF |
Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.