Step size adaptation in reproducing kernel Hilbert space
Authors
Vishwanathan, S.
Schraudolph, Nicol
Smola, Alexander
Publisher
MIT Press
Abstract
This paper presents an online support vector machine (SVM) that uses the stochastic meta-descent
(SMD) algorithm to adapt its step size automatically. We formulate the online learning problem as
a stochastic gradient descent in reproducing kernel Hilbert space (RKHS) and translate SMD to the
nonparametric setting, where its gradient trace parameter is no longer a coefficient vector but an
element of the RKHS. We derive efficient updates that allow us to perform the step size adaptation
in linear time. We apply the online SVM framework to a variety of loss functions, and in particular
show how to handle structured output spaces and achieve efficient online multiclass classification.
Experiments show that our algorithm outperforms more primitive methods for setting the gradient
step size.
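The adaptation the abstract describes follows Schraudolph's stochastic meta-descent: each coordinate keeps its own step size, which is scaled up when successive gradients agree with the accumulated gradient trace and scaled down when they conflict. Below is a minimal parametric sketch of those updates on a toy quadratic objective; the paper's contribution is the kernelized version, where the trace lives in the RKHS rather than in a coefficient vector. All names and constants here are illustrative, not taken from the paper.

```python
import numpy as np

def smd(grad_fn, hess_vec_fn, w, eta0=0.05, mu=0.05, lam=0.99, rho=0.5, steps=500):
    """Stochastic meta-descent (SMD) with per-coordinate step sizes.

    grad_fn(w)        -> gradient at w
    hess_vec_fn(w, v) -> Hessian-vector product at w (SMD never forms the Hessian)
    """
    eta = np.full_like(w, eta0)   # per-coordinate step sizes
    v = np.zeros_like(w)          # gradient trace: sensitivity of w to log(eta)
    for _ in range(steps):
        g = grad_fn(w)
        # Meta-update: grow eta where g and v anti-correlate (consistent descent),
        # shrink it where they correlate (overshooting); rho caps the shrink factor.
        eta = eta * np.maximum(rho, 1.0 - mu * g * v)
        w = w - eta * g
        # Trace update with exponential decay lam and a Hessian-vector product.
        v = lam * v - eta * (g + lam * hess_vec_fn(w, v))
    return w, eta

# Illustrative use: a diagonal quadratic with very different curvatures per axis.
h = np.array([1.0, 10.0])          # curvatures
target = np.array([3.0, -2.0])     # minimizer
w, eta = smd(lambda w: h * (w - target),  # gradient
             lambda w, v: h * v,          # exact Hessian-vector product
             np.zeros(2))
```

In this toy run the flat coordinate (curvature 1.0) ends up with a larger adapted step size than a fixed schedule would give it, which is the behavior the experiments in the paper compare against more primitive step-size rules.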
Citation
Journal of Machine Learning Research 7 (2006): 1107-1133
Source
Journal of Machine Learning Research