
Online learning with kernels

Kivinen, Jyrki; Smola, Alexander; Williamson, Robert

Description

Kernel-based algorithms such as support vector machines have achieved considerable success on various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By applying classical stochastic gradient descent within a feature space, together with some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds and, moreover, show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present experimental results that support the theory and illustrate the power of the new algorithms for online novelty detection.
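The approach the abstract describes, stochastic gradient descent on a regularized risk in a reproducing kernel Hilbert space, can be illustrated with a small sketch. This is not the paper's exact algorithm; the hinge loss, Gaussian kernel, parameter values, and toy data below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    # RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

class KernelOnlineClassifier:
    """Online kernel classifier trained by stochastic gradient descent
    on the regularized hinge loss (illustrative sketch only)."""

    def __init__(self, eta=0.1, lam=0.01, gamma=1.0):
        self.eta = eta      # learning rate
        self.lam = lam      # regularization strength
        self.gamma = gamma  # kernel width
        self.points = []    # stored examples (the kernel expansion)
        self.alphas = []    # their coefficients

    def predict_raw(self, x):
        # f(x) = sum_i alpha_i * k(x_i, x)
        return sum(a * gaussian_kernel(p, x, self.gamma)
                   for a, p in zip(self.alphas, self.points))

    def update(self, x, y):
        # One SGD step on (lam/2)||f||^2 + hinge(y * f(x)):
        # regularization shrinks all old coefficients; a margin
        # violation adds the new point to the expansion.
        f = self.predict_raw(x)
        decay = 1.0 - self.eta * self.lam
        self.alphas = [a * decay for a in self.alphas]
        if y * f < 1.0:  # hinge gradient is -y * k(x, .) here
            self.points.append(np.asarray(x, dtype=float))
            self.alphas.append(self.eta * y)

# Toy usage: learn to separate two Gaussian blobs from a stream.
rng = np.random.default_rng(0)
clf = KernelOnlineClassifier(eta=0.5, lam=0.01, gamma=0.5)
for _ in range(200):
    y = rng.choice([-1, 1])
    x = rng.normal(loc=2.0 * y, scale=0.5, size=2)
    clf.update(x, y)
print(np.sign(clf.predict_raw(np.array([2.0, 2.0]))))    # expect +1.0
print(np.sign(clf.predict_raw(np.array([-2.0, -2.0]))))  # expect -1.0
```

Note the characteristic trade-off the paper addresses: the hypothesis is a growing kernel expansion, so a practical online variant must bound its size, for example by truncating small coefficients after the decay step.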

dc.contributor.author: Kivinen, Jyrki
dc.contributor.author: Smola, Alexander
dc.contributor.author: Williamson, Robert
dc.date.accessioned: 2015-12-13T22:50:24Z
dc.identifier.issn: 1053-587X
dc.identifier.uri: http://hdl.handle.net/1885/80760
dc.description.abstract: Kernel-based algorithms such as support vector machines have achieved considerable success on various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By applying classical stochastic gradient descent within a feature space, together with some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds and, moreover, show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present experimental results that support the theory and illustrate the power of the new algorithms for online novelty detection.
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE Inc)
dc.source: IEEE Transactions on Signal Processing
dc.subject: Keywords: Computational methods; Convergence of numerical methods; Functions; Learning systems; Neural networks; Optimization; Random processes; Regression analysis; Theorem proving; Kernel based algorithms; Large margin classifiers; Novelty detection; Online learning
dc.title: Online learning with kernels
dc.type: Journal article
local.description.notes: Imported from ARIES
local.description.refereed: Yes
local.identifier.citationvolume: 52
dc.date.issued: 2004
local.identifier.absfor: 080109 - Pattern Recognition and Data Mining
local.identifier.ariespublication: MigratedxPub9026
local.type.status: Published Version
local.contributor.affiliation: Kivinen, Jyrki, University of Helsinki
local.contributor.affiliation: Smola, Alexander, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Williamson, Robert, College of Engineering and Computer Science, ANU
local.description.embargo: 2037-12-31
local.bibliographicCitation.issue: 8
local.bibliographicCitation.startpage: 2165
local.bibliographicCitation.lastpage: 2176
local.identifier.doi: 10.1109/TSP.2004.830991
dc.date.updated: 2015-12-11T10:40:14Z
local.identifier.scopusID: 2-s2.0-3543110224
Collections: ANU Research Publications

Download

File: 01_Kivinen_Online_learning_with_ke_2004.pdf (516.83 kB, Adobe PDF). Access restricted: request a copy.


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.

Updated: 19 May 2020 / Responsible Officer: University Librarian / Page Contact: Library Systems & Web Coordinator