Local Rademacher complexities
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
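The quantity described above can be illustrated with a small sketch: a Monte Carlo estimate of the empirical Rademacher average of a finite function class, restricted ("localized") to the functions whose empirical error falls below a radius `r`. The function name `local_rademacher`, the toy data, and the choice of radius are illustrative assumptions, not the paper's construction; the paper works with general (infinite) classes and derives bounds from a fixed point of this localized average.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_rademacher(preds, losses, r, n_rounds=200, rng=rng):
    """Monte Carlo estimate of a localized empirical Rademacher average.

    preds:  (m, n) array; row j holds the values f_j(x_1), ..., f_j(x_n).
    losses: (m,) array of empirical errors; only functions with
            losses <= r enter the supremum (the "local" subset).
    """
    subset = preds[losses <= r]          # restrict to small empirical error
    if subset.size == 0:
        return 0.0
    n = subset.shape[1]
    total = 0.0
    for _ in range(n_rounds):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(subset @ sigma) / n       # sup over the local subset
    return total / n_rounds

# Toy data: 50 random ±1 "classifiers" evaluated on n = 100 points.
preds = rng.choice([-1.0, 1.0], size=(50, 100))
y = rng.choice([-1.0, 1.0], size=100)
losses = np.mean(preds != y, axis=1)              # empirical 0-1 errors

print(local_rademacher(preds, losses, r=0.5))     # small radius: fewer functions
print(local_rademacher(preds, losses, r=1.0))     # radius 1: the whole class
```

Shrinking `r` shrinks the subset over which the supremum is taken, which is what allows the localized average, and hence the resulting error bound, to be smaller than the global Rademacher complexity of the full class.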
Source: The Annals of Statistics