Feature Refinement and Filter Network for Person Re-Identification
Published in | IEEE Transactions on Circuits and Systems for Video Technology, Vol. 31, No. 9, pp. 3391-3402
Main Authors | Xin Ning, Ke Gong, Weijun Li, Liping Zhang, Xiao Bai, Shengwei Tian
Format | Journal Article
Language | English
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.09.2021
Subjects | attention; deep learning; Feature extraction; Image recognition; Information filters; Interference; Person re-identification; Person Search; Robustness; Task analysis; Training
Online Access | https://ieeexplore.ieee.org/document/9285312 ; https://www.proquest.com/docview/2568777300
Abstract | In person re-identification, attention mechanisms and fine-grained information have proved effective. However, models tend to concentrate on the most discriminative features and neglect other valuable ones, the extracted fine-grained information may contain redundancies, and current methods lack an effective scheme for removing background interference. This paper therefore proposes a feature refinement and filter network that addresses these problems from three aspects: first, by weakening high-response features, it identifies further valuable features and extracts the complete features of a person, enhancing the robustness of the model; second, by locating and cropping the high-response regions of a person, it eliminates interference from background information and strengthens the model's response to the person's complete features; finally, a multi-branch attention network selects valuable fine-grained features to further enhance performance. Extensive experiments on the Market-1501, DukeMTMC-reID, CUHK03, and MSMT17 person re-identification benchmarks demonstrate that the method is comparable to state-of-the-art approaches.
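The abstract only sketches the three ideas, so no implementation details are available in this record. As a rough illustration of the first one (suppressing high-response features during training so the network is also pushed to learn less dominant but still valuable cues), a minimal PyTorch-style sketch might look like the following; the module name, the mean-over-channels response map, and the top-k threshold rule are all assumptions made for illustration, not the authors' actual design.

```python
import torch
import torch.nn as nn


class HighResponseSuppression(nn.Module):
    """Illustrative sketch only: zero out the most strongly activated spatial
    positions of a feature map during training, so the network must also rely
    on weaker (but still valuable) responses. The suppression rule here is an
    assumption, not the paper's exact scheme."""

    def __init__(self, drop_ratio: float = 0.3):
        super().__init__()
        self.drop_ratio = drop_ratio  # fraction of spatial positions to suppress

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) backbone feature map
        if not self.training:
            return x
        # Aggregate channel activations into one spatial response map per image.
        response = x.mean(dim=1, keepdim=True)              # (B, 1, H, W)
        b, _, h, w = response.shape
        flat = response.view(b, -1)
        k = max(1, int(self.drop_ratio * h * w))
        # Threshold at the k-th strongest response for each image.
        thresh = flat.topk(k, dim=1).values[:, -1].view(b, 1, 1, 1)
        mask = (response < thresh).float()                   # keep weaker positions
        return x * mask


# Usage sketch: suppress dominant responses on a ResNet-50-like feature map.
feats = torch.randn(8, 2048, 16, 8)
suppress = HighResponseSuppression(drop_ratio=0.3)
suppress.train()
out = suppress(feats)
print(out.shape)  # torch.Size([8, 2048, 16, 8])
```

A module of this kind would typically sit after a backbone stage and be active only during training, as in the usage lines above; the record does not specify how the cropping of high-response person regions or the multi-branch attention are implemented.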
Authors and Affiliations |
– Xin Ning (ORCID 0000-0001-7897-1673), Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; ningxin@semi.ac.cn
– Ke Gong (ORCID 0000-0002-4767-5916), Cognitive Computing Technology Joint Laboratory, Wave Group, Beijing, China; gongke@wavewisdom-bj.com
– Weijun Li (ORCID 0000-0001-9668-2883), Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; wjli@semi.ac.cn
– Liping Zhang (ORCID 0000-0001-6508-3757), Institute of Semiconductors, Chinese Academy of Sciences, Beijing, China; zliping@semi.ac.cn
– Xiao Bai, State Key Laboratory of Software Development Environment, School of Computer Science and Engineering, Jiangxi Research Institute, Beihang University, Beijing, China; baixiao@buaa.edu.cn
– Shengwei Tian (ORCID 0000-0003-3525-5102), School of Software, Xinjiang University, Xinjiang, China
CODEN | ITCTEM |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
DOI | 10.1109/TCSVT.2020.3043026 |
EISSN | 1558-2205 |
Genre | orig-research |
Grant Information |
– National Natural Science Foundation of China, Grant 61901436 (funder ID 10.13039/501100001809)
– Shenzhen Wave Kingdom Company Ltd
ISSN | 1051-8215 |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
PageCount | 12 |
PublicationTitleAbbrev | TCSVT |