A Scalable Algorithm for Learning a Mahalanobis Distance Metric
Date
2009
Authors
Kim, Junae
Shen, Chunhua
Wang, Lei
Publisher
Springer
Abstract
A distance metric that accurately reflects the intrinsic characteristics of data is critical for visual recognition tasks. An effective way to define such a metric is to learn it from a set of training samples. In this work, we propose a fast and scalable algorithm for learning a Mahalanobis distance. By employing the principle of margin maximization to achieve better generalization performance, the algorithm formulates metric learning as a convex optimization problem with a positive semidefinite (psd) matrix variable. Based on the theorem that a psd matrix with a trace of one can always be represented as a convex combination of rank-one matrices, our algorithm employs a differentiable loss function and solves this convex optimization with gradient descent methods. The algorithm not only naturally maintains the psd constraint on the matrix variable, which is essential for metric learning, but also significantly reduces computational overhead, making it much more efficient as the dimensionality of the feature vectors grows. Experiments on benchmark data sets indicate that, compared with existing metric learning algorithms, our algorithm achieves higher classification accuracy with much less computational load.
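To make the key idea concrete: because any psd matrix with a trace of one is a convex combination of rank-one matrices, the feasible set can be maintained by only ever mixing rank-one, trace-one updates into the current estimate, with no explicit psd projection. The sketch below illustrates this on triplet constraints using a Frank-Wolfe-style update and a logistic surrogate loss; the data layout, loss choice, and step-size schedule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def mahalanobis_sq(M, a, b):
    """Squared Mahalanobis distance (a - b)^T M (a - b)."""
    d = a - b
    return d @ M @ d

def triplet_loss_grad(M, X, triplets):
    """Logistic surrogate loss over triplets (i, j, k): x_i should be
    closer to x_j than to x_k under M. Returns (loss, gradient w.r.t. M)."""
    loss, grad = 0.0, np.zeros_like(M)
    for i, j, k in triplets:
        d_ij, d_ik = X[i] - X[j], X[i] - X[k]
        A = np.outer(d_ik, d_ik) - np.outer(d_ij, d_ij)
        rho = mahalanobis_sq(M, X[i], X[k]) - mahalanobis_sq(M, X[i], X[j])
        loss += np.log1p(np.exp(-rho))            # differentiable hinge surrogate
        grad += -A / (1.0 + np.exp(rho))          # d/dM of log(1 + exp(-<M, A>))
    return loss, grad

def learn_metric(X, triplets, n_iter=100):
    """Keep M in {M >= 0, tr(M) = 1} by always mixing in a rank-one,
    trace-one matrix, so M stays a convex combination of rank-one terms."""
    dim = X.shape[1]
    M = np.eye(dim) / dim                         # feasible start: psd, trace one
    for t in range(n_iter):
        _, G = triplet_loss_grad(M, X, triplets)
        evals, evecs = np.linalg.eigh(G)
        v = evecs[:, 0]                           # eigenvector of most negative eigenvalue
        S = np.outer(v, v)                        # rank-one, psd, trace one
        gamma = 2.0 / (t + 2.0)                   # standard Frank-Wolfe step size
        M = (1.0 - gamma) * M + gamma * S         # convex combination preserves the constraints
    return M
```

Starting from a feasible point such as the scaled identity, every iterate remains a convex combination of rank-one matrices, which is what lets this style of method avoid the costly eigen-projection onto the psd cone at each step.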
Keywords
Benchmark data; Classification accuracy; Computational loads; Computational overheads; Convex combinations; Convex optimization problems; Distance metrics; Effective solution; Experimental studies; Feature vectors; Generalization performance; Gradient Descent
Source
Proceedings of Asian Conference on Computer Vision (ACCV 2009)
Type
Conference paper
Restricted until
2037-12-31