Pedestrian alignment network for large-scale person re-identification

dc.contributor.author: Zheng, Zhedong
dc.contributor.author: Zheng, Liang
dc.contributor.author: Yang, Yi
dc.date.accessioned: 2023-12-11T22:39:29Z
dc.date.issued: 2019
dc.date.updated: 2022-09-04T08:17:51Z
dc.description.abstract: Person re-identification (re-ID) is mostly viewed as an image retrieval problem. This task aims to search a query person in a large image pool. In practice, person re-ID usually adopts automatic detectors to obtain cropped pedestrian images. However, this process suffers from two types of detector errors: excessive background and part missing. Both errors deteriorate the quality of pedestrian alignment and may compromise pedestrian matching due to the position and scale variances. To address the misalignment problem, we propose that alignment be learned from an identification procedure. We introduce the pedestrian alignment network (PAN), which allows discriminative embedding learning and pedestrian alignment without extra annotations. We observe that when the convolutional neural network learns to discriminate between different identities, the learned feature maps usually exhibit strong activations on the human body rather than the background. The proposed network thus takes advantage of this attention mechanism to adaptively locate and align pedestrians within a bounding box. Visual examples show that pedestrians are better aligned with PAN. Experiments on three large-scale re-ID datasets confirm that PAN improves the discriminative ability of the feature embeddings and yields competitive accuracy with the state-of-the-art methods.
dc.format.mimetype: application/pdf
dc.identifier.issn: 1051-8215
dc.identifier.uri: http://hdl.handle.net/1885/309786
dc.language.iso: en_AU
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE Inc)
dc.rights: © 2019 The authors
dc.source: IEEE Transactions on Circuits and Systems for Video Technology
dc.subject: Person re-identification
dc.subject: person search
dc.subject: person alignment
dc.subject: image retrieval
dc.subject: deep learning
dc.title: Pedestrian alignment network for large-scale person re-identification
dc.type: Journal article
local.bibliographicCitation.issue: 10
local.bibliographicCitation.lastpage: 3045
local.bibliographicCitation.startpage: 3037
local.contributor.affiliation: Zheng, Zhedong, University of Technology Sydney
local.contributor.affiliation: Zheng, Liang, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Yang, Yi, University of Technology Sydney
local.contributor.authoremail: u1064892@anu.edu.au
local.contributor.authoruid: Zheng, Liang, u1064892
local.description.embargo: 2099-12-31
local.description.notes: Imported from ARIES
local.identifier.absfor: 461103 - Deep learning
local.identifier.absfor: 460304 - Computer vision
local.identifier.ariespublication: u5786633xPUB952
local.identifier.citationvolume: 29
local.identifier.doi: 10.1109/TCSVT.2018.2873599
local.identifier.scopusID: 2-s2.0-85054525203
local.identifier.thomsonID: WOS:000489749900014
local.identifier.uidSubmittedBy: u5786633
local.publisher.url: https://ieeexplore.ieee.org/
local.type.status: Published Version
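
The abstract above describes an attention-driven alignment mechanism: feature maps learned for identity discrimination fire strongly on the human body, and this response is used to re-locate and re-scale the pedestrian within the detector crop. The sketch below is a minimal illustration of that idea under the assumption of a PyTorch environment with torchvision; it is not the authors' PAN implementation, and the helper names `attention_map` and `align_with_attention`, the ResNet-50 backbone choice, and the scale heuristics are assumptions made for exposition only.

```python
# Illustrative sketch of attention-driven pedestrian alignment (NOT the exact PAN
# architecture). Assumes PyTorch + torchvision; helper names are hypothetical.
import torch
import torch.nn.functional as F
from torchvision import models

backbone = models.resnet50()
# Keep layers up to the last convolutional block so spatial feature maps are preserved.
features = torch.nn.Sequential(*list(backbone.children())[:-2])

def attention_map(feat):
    # feat: (B, C, h, w) -> (B, h, w); summing activations over channels tends to
    # highlight the body region once the network is trained for identification.
    return feat.abs().sum(dim=1)

def align_with_attention(images, feat):
    """Re-sample each crop around the region where the feature maps respond most."""
    B, _, H, W = images.shape
    attn = attention_map(feat)
    attn = attn / (attn.sum(dim=(1, 2), keepdim=True) + 1e-6)
    h, w = attn.shape[1:]
    ys = torch.linspace(-1, 1, h, device=images.device).view(1, h, 1)
    xs = torch.linspace(-1, 1, w, device=images.device).view(1, 1, w)
    # Attention-weighted centre and spread in normalized [-1, 1] coordinates.
    cy = (attn * ys).sum(dim=(1, 2))
    cx = (attn * xs).sum(dim=(1, 2))
    sy = ((attn * (ys - cy.view(-1, 1, 1)) ** 2).sum(dim=(1, 2)).sqrt() * 4).clamp(0.3, 1.0)
    sx = ((attn * (xs - cx.view(-1, 1, 1)) ** 2).sum(dim=(1, 2)).sqrt() * 4).clamp(0.3, 1.0)
    # Affine transform (scale + translation) and bilinear re-sampling, in the
    # spirit of a spatial transformer.
    theta = torch.zeros(B, 2, 3, device=images.device)
    theta[:, 0, 0], theta[:, 0, 2] = sx, cx
    theta[:, 1, 1], theta[:, 1, 2] = sy, cy
    grid = F.affine_grid(theta, images.shape, align_corners=False)
    return F.grid_sample(images, grid, align_corners=False)

# Usage: x is a batch of detector crops (B, 3, 256, 128).
x = torch.randn(4, 3, 256, 128)
aligned = align_with_attention(x, features(x))
```

In the paper the alignment is applied with an affine estimation branch trained jointly with the identification loss; the heuristic above merely shows how an activation map can drive a crop-and-rescale step that removes excessive background or re-centres a partially cropped pedestrian.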

Downloads

Original bundle (1 file)
Name: Pedestrian_Alignment_Network_for_Large-scale_Person_Re-Identification.pdf
Size: 3.96 MB
Format: Adobe Portable Document Format