Multi-scale local-global architecture for person re-identification

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: Liu, Jing [en_US]
dc.contributor.author: Tiwari, Prayag [en_US]
dc.contributor.author: Nguyen, Tri Gia [en_US]
dc.contributor.author: Gupta, Deepak [en_US]
dc.contributor.author: Band, Shahab S. [en_US]
dc.contributor.department: Department of Computer Science [en]
dc.contributor.groupauthor: Professorship Marttinen P. [en]
dc.contributor.organization: Northwestern Polytechnical University [en_US]
dc.contributor.organization: FPT University [en_US]
dc.contributor.organization: Maharaja Agrasen Institute of Technology [en_US]
dc.contributor.organization: National Yunlin University of Science and Technology [en_US]
dc.date.accessioned: 2022-03-16T12:52:14Z
dc.date.available: 2022-03-16T12:52:14Z
dc.date.issued: 2022-08 [en_US]
dc.description: openaire: EC/H2020/101016775/EU//INTERVENE. Funding Information: Open Access funding provided by Aalto University. This work was supported by the Academy of Finland (Grants 336033, 315896), Business Finland (Grant 884/31/2018), and EU H2020 (Grant 101016775). Publisher Copyright: © 2022, The Author(s).
dc.description.abstract: The emergence of deep learning methods has driven great success in the field of person re-identification (re-ID). However, existing works mainly rely on first-order attention statistics (i.e., spatial and channel attention) to model the valuable information for person re-ID. Moreover, most existing methods process data points individually, which ignores discriminative group-wise patterns to some extent. In this paper, we present an automated framework named multi-scale local-global architecture for person re-ID. The framework consists of two components. First, a high-order attention module is adopted to learn high-order attention patterns that capture the subtle differences among pedestrians and generate informative attention features. Second, a novel architecture named spectral feature transformation is designed to facilitate the optimization of group-wise similarities. Furthermore, we fuse the two components into an ensemble model for person re-ID. Extensive experiments conducted on three benchmark datasets, i.e., Market-1501, DukeMTMC-reID, and CUHK03, show the superiority of the proposed method. [en]
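
The two components described in the abstract can be illustrated with a short sketch. The PyTorch snippet below is a minimal, hypothetical illustration rather than the paper's implementation: a simplified second-order attention block stands in for the high-order attention module, and a softmax-normalised batch similarity matrix stands in for the spectral feature transformation; all class names, dimensions, and the final fusion step are assumptions made for this example.

```python
# Illustrative sketch only: a simplified second-order ("high-order") attention
# block and a batch-level spectral feature transformation. Names, dimensions,
# and the fusion step are assumptions, not the authors' exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HighOrderAttention(nn.Module):
    """Second-order spatial attention: multiplicative feature interactions
    beyond first-order spatial/channel statistics (assumed simplification)."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.embed_a = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.embed_b = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.out = nn.Conv2d(channels // reduction, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature map from a CNN backbone
        a, b = self.embed_a(x), self.embed_b(x)          # two 1x1 embeddings
        attn = torch.sigmoid(self.out(a * b))            # second-order (multiplicative) interaction
        return x * attn                                  # re-weight the input features


def spectral_feature_transform(features: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Propagate information across a mini-batch via a softmax-normalised
    similarity matrix, so group-wise similarities shape each embedding."""
    # features: (N, D) embeddings of the images in the batch
    f = F.normalize(features, dim=1)
    sim = f @ f.t() / temperature                        # (N, N) pairwise similarities
    weights = F.softmax(sim, dim=1)                      # row-stochastic transformation matrix
    return weights @ features                            # similarity-weighted mixture per image


if __name__ == "__main__":
    # Toy usage: random backbone features for a batch of 8 images
    feat_map = torch.randn(8, 256, 16, 8)
    attended = HighOrderAttention(256)(feat_map)          # high-order attention branch
    embeddings = attended.mean(dim=(2, 3))                # global average pooling -> (8, 256)
    transformed = spectral_feature_transform(embeddings)  # group-wise branch
    fused = torch.cat([embeddings, transformed], dim=1)   # simple ensemble-style fusion (assumed)
    print(fused.shape)                                    # torch.Size([8, 512])
```

The row-stochastic similarity matrix is what lets relations within the batch influence each individual embedding, which is the group-wise property the abstract attributes to the spectral feature transformation.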
dc.description.version: Peer reviewed [en]
dc.format.mimetype: application/pdf [en_US]
dc.identifier.citation: Liu, J, Tiwari, P, Nguyen, T G, Gupta, D & Band, S S 2022, 'Multi-scale local-global architecture for person re-identification', Soft Computing, vol. 26, no. 16, pp. 7967-7977. https://doi.org/10.1007/s00500-022-06859-6 [en]
dc.identifier.doi: 10.1007/s00500-022-06859-6 [en_US]
dc.identifier.issn: 1432-7643
dc.identifier.issn: 1433-7479
dc.identifier.other: PURE UUID: c8e9ab5b-c1b0-4df0-97ec-60627beaf5d7 [en_US]
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/c8e9ab5b-c1b0-4df0-97ec-60627beaf5d7 [en_US]
dc.identifier.other: PURE LINK: http://www.scopus.com/inward/record.url?scp=85125526072&partnerID=8YFLogxK
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/85804747/Multi_scale_local_global_architecture_for_person_re_identification.pdf [en_US]
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/113389
dc.identifier.urn: URN:NBN:fi:aalto-202203162268
dc.language.iso: en [en]
dc.publisher: Springer
dc.relation: info:eu-repo/grantAgreement/EC/H2020/101016775/EU//INTERVENE [en_US]
dc.relation.ispartofseries: Soft Computing [en]
dc.relation.ispartofseries: Volume 26, issue 16, pp. 7967-7977 [en]
dc.rights: openAccess [en]
dc.subject.keyword: Attention mechanism [en_US]
dc.subject.keyword: Deep learning [en_US]
dc.subject.keyword: Multi-scale local-global architecture [en_US]
dc.subject.keyword: Person re-identification [en_US]
dc.title: Multi-scale local-global architecture for person re-identification [en]
dc.type: A1 Alkuperäisartikkeli tieteellisessä aikakauslehdessä (A1 Original article in a scientific journal) [fi]
dc.type.version: publishedVersion
