A scaled Bregman theorem with applications
Bregman divergences play a central role in the design and analysis of a range of machine learning algorithms through a handful of popular theorems. We present a new theorem which shows that "Bregman distortions" (employing a potentially non-convex generator) may be exactly rewritten as a scaled Bregman divergence computed over transformed data. This property can be viewed from the standpoints of geometry (a scaled isometry with adaptive metrics) or convex optimization (relating generalized ...
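The headline identity in the abstract is easy to check numerically in a special case. The sketch below is a hedged illustration, not the paper's full theorem (which covers non-convex distortion generators and more general scalings): it assumes an affine, positive scaling g and the squared Euclidean generator, for which the distortion built from the perspective-like generator phi_tilde(z) = g(z) * phi(z / g(z)) equals g(x) times the ordinary Bregman divergence of phi on the transformed points x/g(x) and y/g(y). All names here (bregman, phi_tilde, g) are illustrative, and the placement of the scaling factor on the first argument follows this derivation; argument-order conventions vary.

    import numpy as np

    def bregman(phi, grad_phi, x, y):
        # Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
        return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

    phi = lambda z: 0.5 * np.dot(z, z)          # squared Euclidean generator
    grad_phi = lambda z: z

    a = np.array([0.1, 0.2, 0.3])               # affine, positive scaling g(z) = <a, z> + b
    b = 2.0
    g = lambda z: np.dot(a, z) + b

    # "Distortion" generator: the g-perspective of phi
    phi_tilde = lambda z: g(z) * phi(z / g(z))

    def grad_phi_tilde(z):
        # Analytic gradient of phi_tilde, valid for affine g
        u = z / g(z)
        return grad_phi(u) + a * (phi(u) - np.dot(grad_phi(u), u))

    rng = np.random.default_rng(0)
    x = rng.uniform(0.5, 1.5, size=3)
    y = rng.uniform(0.5, 1.5, size=3)

    lhs = bregman(phi_tilde, grad_phi_tilde, x, y)           # distortion on raw data
    rhs = g(x) * bregman(phi, grad_phi, x / g(x), y / g(y))  # scaled divergence on transformed data
    print(lhs, rhs)  # agree up to floating-point error

Running this prints two equal values (up to rounding), matching the claim that the distortion on the raw data coincides with a scaled Bregman divergence on the transformed data.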
Collections: ANU Research Publications
Source: Advances in Neural Information Processing Systems
Access Rights: Open Access
File: 01_Nock_A_scaled_Bregman_theorem_with_2016.pdf (1.07 MB, Adobe PDF)