Learning with Non-Positive Kernels
Date
2004
Authors
Ong, Cheng Soon
Mary, Xavier
Canu, Stéphane
Smola, Alexander
Publisher
Association for Computing Machinery Inc (ACM)
Abstract
In this paper we show that many kernel methods can be adapted to deal with indefinite kernels, that is, kernels which are not positive semidefinite. Such kernels do not satisfy Mercer's condition, and they induce associated functional spaces called Reproducing Kernel Kreĭn Spaces (RKKS), a generalization of Reproducing Kernel Hilbert Spaces (RKHS). Machine learning in an RKKS shares many "nice" properties of learning in an RKHS, such as orthogonality and projection. However, since the kernels are indefinite, we can no longer minimize the loss; instead we stabilize it. We show a general representer theorem for constrained stabilization and prove generalization bounds by computing the Rademacher averages of the kernel class. We list several examples of indefinite kernels and investigate regularization methods to solve spline interpolation. Some preliminary experiments with indefinite kernels for spline smoothing are reported for truncated spectral factorization, Landweber-Fridman iterations, and MR-II.
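As an illustration of the abstract's central object, the sketch below builds a Gram matrix for an indefinite kernel written as the difference of two Gaussian kernels (the canonical Kreĭn-space form, a positive part minus a positive part) and checks that it has both positive and negative eigenvalues, i.e. it is not positive semidefinite. The specific kernel and bandwidths are illustrative choices, not taken from the paper.

```python
import numpy as np

def indefinite_kernel(x, y, gamma_plus=1.0, gamma_minus=0.1):
    # Difference of two Gaussian (RBF) kernels: k = k_plus - k_minus.
    # Each summand is positive definite, but their difference in general
    # is not, which is the canonical decomposition of a Krein-space kernel.
    d2 = np.sum((x - y) ** 2)
    return np.exp(-gamma_plus * d2) - np.exp(-gamma_minus * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))  # 20 sample points in R^2

# Gram matrix K[i, j] = k(x_i, x_j); symmetric with zero diagonal here,
# since k(x, x) = exp(0) - exp(0) = 0.
K = np.array([[indefinite_kernel(a, b) for b in X] for a in X])

eigvals = np.linalg.eigvalsh(K)  # K is symmetric, so eigvalsh applies
print("min eigenvalue:", eigvals.min())
print("max eigenvalue:", eigvals.max())
```

Because the diagonal of this Gram matrix is zero, its trace (the sum of its eigenvalues) is zero, so any nonzero off-diagonal entries force a mix of positive and negative eigenvalues; this is why the kernel cannot satisfy Mercer's condition.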
Keywords
Keywords: Approximation theory; Costs; Eigenvalues and eigenfunctions; Iterative methods; Matrix algebra; Problem solving; Set theory; Spurious signal noise; Hilbert spaces; Matrix-vector products; Reproducing kernel Hilbert spaces; Reproducing Kernel Krein Spaces; Ill-posed Problems; Indefinite Kernels; Non-convex Optimization; Rademacher Average; Representer Theorem
Source
Proceedings of 21st International Conference on Machine Learning (ICML-2004)
Type
Conference paper
Restricted until
2037-12-31