
Learning Discriminative αβ-Divergences for Positive Definite Matrices

Authors

Cherian, Anoop
Stanitsas, Panagiotis
Harandi, Mehrtash
Morellas, Vassilios
Papanikolopoulos, Nikolaos

Publisher

IEEE

Abstract

Symmetric positive definite (SPD) matrices are useful for capturing second-order statistics of visual data. To compare two SPD matrices, several measures are available, such as the affine-invariant Riemannian metric, the Jeffreys divergence, and the Jensen-Bregman logdet divergence; however, their behavior may be application dependent, raising the need for manual selection to achieve the best possible performance. Further, as these measures are computationally demanding in large-scale settings, computing pairwise similarities via a clever embedding of SPD matrices is often preferred to using the measures directly. In this paper, we propose a discriminative metric learning framework, Information Divergence and Dictionary Learning (IDDL), that not only learns application-specific measures on SPD matrices automatically, but also embeds the matrices as vectors using a learned dictionary. To learn the similarity measures (which could potentially be distinct for every dictionary atom), we use the recently introduced αβ-logdet divergence, which is known to unify the measures listed above. We propose a novel IDDL objective that jointly learns the parameters of the divergence and the dictionary atoms in a discriminative setup, and we solve it efficiently using Riemannian optimization. We showcase extensive experiments on eight computer vision datasets, demonstrating state-of-the-art performance.
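
The αβ-logdet divergence referenced in the abstract has a closed form through the generalized eigenvalues of the matrix pair, which is what makes per-atom divergence parameters tractable. The following is a minimal sketch of that computation and of the resulting vector embedding, not the authors' implementation: it assumes the Cichocki-Cruces-Amari form of the divergence, and the function names, the random dictionary atoms, and the fixed (α, β) values are illustrative stand-ins for quantities that IDDL would learn.

```python
# Minimal sketch (not the authors' implementation): the alpha-beta-logdet
# divergence in closed form, plus a toy IDDL-style vector embedding.
import numpy as np
from scipy.linalg import eigh

def ab_logdet_div(P, Q, alpha, beta):
    """AB-logdet divergence D^(alpha,beta)(P || Q) for SPD matrices P, Q.

    With lam_i the generalized eigenvalues of the pencil (P, Q),
        D = 1/(alpha*beta) * sum_i log((alpha*lam_i**beta
                                        + beta*lam_i**(-alpha)) / (alpha+beta)),
    valid for alpha, beta != 0 and alpha + beta != 0
    (Cichocki, Cruces, and Amari, 2015).
    """
    lam = eigh(P, Q, eigvals_only=True)  # spectrum of inv(Q) @ P, all > 0
    return np.sum(np.log((alpha * lam**beta + beta * lam**(-alpha))
                         / (alpha + beta))) / (alpha * beta)

def iddl_embedding(X, atoms, alphas, betas):
    """Embed SPD matrix X as its vector of divergences to dictionary atoms,
    each atom paired with its own (alpha, beta). Hypothetical helper."""
    return np.array([ab_logdet_div(X, B, a, b)
                     for B, a, b in zip(atoms, alphas, betas)])

# Toy usage with random SPD matrices standing in for learned atoms.
rng = np.random.default_rng(0)
def rand_spd(d):
    A = rng.standard_normal((d, d))
    return A @ A.T + d * np.eye(d)

X = rand_spd(5)
atoms = [rand_spd(5) for _ in range(4)]
phi = iddl_embedding(X, atoms,
                     alphas=[0.5, 1.0, 0.5, 1.0],
                     betas=[0.5, 1.0, 1.0, 0.5])
print(phi)  # 4-dimensional vector embedding of X
```

Particular (α, β) choices recover, up to scaling and limits, the measures named in the abstract (e.g., the affine-invariant Riemannian metric and the Jensen-Bregman logdet divergence), which is why learning these parameters per atom subsumes manual measure selection.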

Source

Proceedings of the IEEE International Conference on Computer Vision

Access Statement

Open Access
