
Robust Tracking using Manifold Convolutional Neural Networks with Laplacian Regularization

Hu, Hongwei; Ma, Bo; Shen, Jianbing; Sun, Hanqiu; Shao, Ling; Porikli, Fatih

Description

In visual tracking, usually only a small number of samples are labeled, and most existing deep-learning-based trackers ignore the abundant unlabeled samples that could provide additional information to boost tracking performance. An intuitive way to exploit unlabeled data is to incorporate manifold regularization into the common classification loss functions, but the high computational cost may keep such deep trackers from practical application. To overcome this issue, we propose a two-stage approach to a deep tracker that takes both labeled and unlabeled samples into account. Annotations for unlabeled samples are first propagated from their labeled neighbors by exploring the manifold space in which these samples are assumed to lie. We then refine these annotations by training a deep convolutional neural network (CNN) on both labeled and unlabeled data in a supervised manner. Online visual tracking is further carried out under the framework of particle filters, with the presented manifold-regularized deep model updated every few frames. Experimental results on different public tracking datasets demonstrate that our tracker outperforms most existing visual tracking approaches.
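
The first stage described above amounts to graph-based label propagation with a Laplacian regularizer, the technique named in the title. Below is a minimal sketch of that idea, assuming a Gaussian affinity graph and the harmonic solution of Zhu et al. (2003); the function name, the bandwidth parameter sigma, and the binary-label setup are illustrative choices, not the authors' exact formulation.

    import numpy as np

    def propagate_labels(X_l, y_l, X_u, sigma=1.0):
        # Stack labeled and unlabeled samples into one feature matrix.
        X = np.vstack([X_l, X_u])
        n_l = X_l.shape[0]
        # Gaussian affinity between every pair of samples (sigma is a free choice).
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        # Graph Laplacian L = D - W underlying the Laplacian regularizer.
        L = np.diag(W.sum(axis=1)) - W
        # Harmonic solution f_u = -L_uu^{-1} L_ul y_l: labels flow from
        # labeled neighbours to unlabeled samples along the sample graph.
        L_uu = L[n_l:, n_l:]
        L_ul = L[n_l:, :n_l]
        f_u = np.linalg.solve(L_uu, -(L_ul @ y_l))
        return f_u  # soft labels in [0, 1] for the unlabeled samples

    # Toy usage: two labeled points, three unlabeled ones near them.
    X_l = np.array([[0.0, 0.0], [1.0, 1.0]])
    y_l = np.array([0.0, 1.0])
    X_u = np.array([[0.1, 0.1], [0.9, 0.9], [0.5, 0.5]])
    print(propagate_labels(X_l, y_l, X_u))

In the paper's scheme, these soft labels would then join the labeled data for supervised CNN training, and the resulting model would score particle-filter candidates during online tracking.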

dc.contributor.author: Hu, Hongwei
dc.contributor.author: Ma, Bo
dc.contributor.author: Shen, Jianbing
dc.contributor.author: Sun, Hanqiu
dc.contributor.author: Shao, Ling
dc.contributor.author: Porikli, Fatih
dc.date.accessioned: 2020-09-14T00:02:32Z
dc.date.available: 2020-09-14T00:02:32Z
dc.identifier.issn: 1520-9210
dc.identifier.uri: http://hdl.handle.net/1885/209992
dc.description.abstract: In visual tracking, usually only a small number of samples are labeled, and most existing deep-learning-based trackers ignore the abundant unlabeled samples that could provide additional information to boost tracking performance. An intuitive way to exploit unlabeled data is to incorporate manifold regularization into the common classification loss functions, but the high computational cost may keep such deep trackers from practical application. To overcome this issue, we propose a two-stage approach to a deep tracker that takes both labeled and unlabeled samples into account. Annotations for unlabeled samples are first propagated from their labeled neighbors by exploring the manifold space in which these samples are assumed to lie. We then refine these annotations by training a deep convolutional neural network (CNN) on both labeled and unlabeled data in a supervised manner. Online visual tracking is further carried out under the framework of particle filters, with the presented manifold-regularized deep model updated every few frames. Experimental results on different public tracking datasets demonstrate that our tracker outperforms most existing visual tracking approaches.
dc.format.mimetype: application/pdf
dc.language.iso: en_AU
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE Inc)
dc.rights: © 2018 IEEE
dc.source: IEEE Transactions on Multimedia
dc.title: Robust Tracking using Manifold Convolutional Neural Networks with Laplacian Regularization
dc.type: Journal article
local.description.notes: Imported from ARIES
local.identifier.citationvolume: 21
dc.date.issued: 2018
local.identifier.absfor: 080104 - Computer Vision
local.identifier.ariespublication: a383154xPUB10473
local.publisher.url: https://www.ieee.org/
local.type.status: Accepted Version
local.contributor.affiliation: Hu, Hongwei, Beijing Lab of Intelligent Information Technology
local.contributor.affiliation: Ma, Bo, Beijing Institute of Technology
local.contributor.affiliation: Shen, Jianbing, Beijing Lab of Intelligent Information Technology
local.contributor.affiliation: Sun, Hanqiu, Chinese University of Hong Kong
local.contributor.affiliation: Shao, Ling, University of East Anglia
local.contributor.affiliation: Porikli, Fatih, College of Engineering and Computer Science, ANU
local.bibliographicCitation.issue: 2
local.bibliographicCitation.startpage: 510
local.bibliographicCitation.lastpage: 521
local.identifier.doi: 10.1109/TMM.2018.2859831
local.identifier.absseo: 899999 - Information and Communication Services not elsewhere classified
dc.date.updated: 2020-06-23T00:52:22Z
local.identifier.scopusID: 2-s2.0-85050596820
dcterms.accessRights: Open Access
dc.provenance: https://v2.sherpa.ac.uk/id/publication/3527... "The Accepted Version can be archived in an Institutional Repository" from SHERPA/RoMEO site (as at 14/09/2020).
Collections: ANU Research Publications

Download

File: 01_Hu_Robust_Tracking_using_Manifold_2018.pdf (4.63 MB, Adobe PDF)


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
