
Learning an invariant Hilbert space for domain adaptation

Herath, Samitha; Harandi, Mehrtash; Porikli, Fatih

Description

This paper introduces a learning scheme to construct a Hilbert space (i.e., a vector space along with its inner product) to address both unsupervised and semi-supervised domain adaptation problems. This is achieved by learning projections from each domain to a latent space, along with the Mahalanobis metric of the latent space, to simultaneously minimize a notion of domain variance while maximizing a measure of discriminatory power. In particular, we make use of Riemannian optimization techniques to match statistical properties (e.g., first and second order statistics) between samples projected into the latent space from different domains. Upon availability of class labels, we further deem samples sharing the same label to form more compact clusters while pulling away samples coming from different classes. We extensively evaluate and contrast our proposal against state-of-the-art methods for the task of visual domain adaptation using both handcrafted and deep-net features. Our experiments show that even with a simple nearest neighbor classifier, the proposed method can outperform several state-of-the-art methods benefitting from more involved classification schemes.
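For orientation, the sketch below illustrates (in a simplified, Euclidean form) the kind of domain-variance term built from first- and second-order statistics of projected samples that the abstract describes. The paper itself optimizes the projections and a Mahalanobis metric jointly with Riemannian optimization; the function and variable names here (domain_variance, Xs, Xt, Ws, Wt) are illustrative assumptions, not the authors' notation or implementation.

```python
# Minimal sketch, assuming a shared latent space reached by linear projections.
# This is an illustrative stand-in, not the paper's actual objective or solver.
import numpy as np

def domain_variance(Xs, Xt, Ws, Wt):
    """Mean and covariance discrepancy between source and target samples
    after projection into a common latent space.

    Xs: (n_s, d_s) source features, Xt: (n_t, d_t) target features
    Ws: (d_s, p) source projection, Wt: (d_t, p) target projection
    """
    Zs, Zt = Xs @ Ws, Xt @ Wt                                    # project into latent space
    mean_gap = np.linalg.norm(Zs.mean(axis=0) - Zt.mean(axis=0)) ** 2          # 1st-order stats
    cov_gap = np.linalg.norm(np.cov(Zs, rowvar=False)
                             - np.cov(Zt, rowvar=False), 'fro') ** 2           # 2nd-order stats
    return mean_gap + cov_gap

# Toy usage with random data and random orthonormal projections.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 20))
Xt = rng.normal(loc=0.5, size=(80, 30))
Ws, _ = np.linalg.qr(rng.normal(size=(20, 5)))
Wt, _ = np.linalg.qr(rng.normal(size=(30, 5)))
print(domain_variance(Xs, Xt, Ws, Wt))
```

In the paper this kind of discrepancy term is minimized jointly with a discriminative term (compact same-class clusters, separated different-class samples) when labels are available, whereas the sketch only evaluates the unsupervised statistical-matching part.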

dc.contributor.author: Herath, Samitha
dc.contributor.author: Harandi, Mehrtash
dc.contributor.author: Porikli, Fatih
dc.contributor.editor: O'Conner, Lisa
dc.coverage.spatial: Honolulu USA
dc.date.accessioned: 2020-09-14T04:33:02Z
dc.date.created: July 21-26 2017
dc.identifier.isbn: 9781538604571
dc.identifier.uri: http://hdl.handle.net/1885/210273
dc.description.abstract: This paper introduces a learning scheme to construct a Hilbert space (i.e., a vector space along with its inner product) to address both unsupervised and semi-supervised domain adaptation problems. This is achieved by learning projections from each domain to a latent space, along with the Mahalanobis metric of the latent space, to simultaneously minimize a notion of domain variance while maximizing a measure of discriminatory power. In particular, we make use of Riemannian optimization techniques to match statistical properties (e.g., first and second order statistics) between samples projected into the latent space from different domains. Upon availability of class labels, we further deem samples sharing the same label to form more compact clusters while pulling away samples coming from different classes. We extensively evaluate and contrast our proposal against state-of-the-art methods for the task of visual domain adaptation using both handcrafted and deep-net features. Our experiments show that even with a simple nearest neighbor classifier, the proposed method can outperform several state-of-the-art methods benefitting from more involved classification schemes.
dc.format.mimetype: application/pdf
dc.language.iso: en_AU
dc.publisher: IEEE
dc.relation.ispartof: 30th IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2017
dc.rights: © 2017 IEEE
dc.source: Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2017
dc.title: Learning an invariant Hilbert space for domain adaptation
dc.type: Conference paper
local.description.notes: Imported from ARIES
local.description.refereed: Yes
dc.date.issued: 2017
local.identifier.absfor: 080104 - Computer Vision
local.identifier.ariespublication: a383154xPUB9099
local.publisher.url: http://www.ieee.org/index.html
local.type.status: Published Version
local.contributor.affiliation: Herath, Samitha, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Harandi, Mehrtash, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Porikli, Fatih, College of Engineering and Computer Science, ANU
local.description.embargo: 2037-12-31
local.bibliographicCitation.startpage: 3956
local.bibliographicCitation.lastpage: 3965
local.identifier.doi: 10.1109/CVPR.2017.421
local.identifier.absseo: 899999 - Information and Communication Services not elsewhere classified
dc.date.updated: 2020-06-23T00:53:06Z
local.identifier.scopusID: 2-s2.0-85041925749
Collections: ANU Research Publications

Download

File: 01_Herath_Learning_an_invariant_Hilbert_2017.pdf
Size: 742.87 kB
Format: Adobe PDF


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
