Radar gait recognition using Dual-branch Swin Transformer with Asymmetric Attention Fusion
Published in | Pattern Recognition, Vol. 159, p. 111101 |
Main Authors | He, Wentao; Ren, Jianfeng; Bai, Ruibin; Jiang, Xudong |
Format | Journal Article |
Language | English |
Published | Elsevier Ltd, 01.03.2025 |
Subjects | Asymmetric Attention Fusion; Cadence velocity diagram; Micro-Doppler signature; Radar gait recognition; Spectrogram |
Online Access | https://www.sciencedirect.com/science/article/pii/S0031320324008525 |
Abstract | Video-based gait recognition suffers from potential privacy issues and from performance degradation in dim environments, under partial occlusion, or across camera view changes. Radar has recently become increasingly popular because it overcomes many of the challenges faced by vision sensors. To capture the tiny differences in the radar gait signatures of different people, a dual-branch Swin Transformer is proposed, where one branch captures the time variations of the radar micro-Doppler signature and the other captures the repetitive frequency patterns of the spectrogram through the cadence velocity diagram (CVD). Unlike natural images, where objects can be translated, rotated, or scaled, the spatial coordinates of spectrograms and CVDs have unique physical meanings, and radar targets undergo no affine transformation in these synthetic images. The patch-splitting mechanism of the Vision Transformer therefore makes it well suited to extracting discriminant information from individual patches and learning attentive relations across patches, as each patch carries unique physical properties of the radar target. The Swin Transformer consists of a set of cascaded Swin blocks that extract semantic features from shallow to deep representations, further improving classification performance. Lastly, to highlight the branch with larger discriminant power, an Asymmetric Attention Fusion is proposed to optimally fuse the discriminant features from the two branches. To enrich research on radar gait recognition, a large-scale NTU-RGR dataset is constructed, containing 45,768 radar frames from 98 subjects. The proposed method is evaluated on the NTU-RGR dataset and the MMRGait-1.0 database, and it consistently and significantly outperforms all compared methods on both datasets. The code is available at: https://github.com/wentaoheunnc/NTU-RGR.
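The two branch inputs described above, the micro-Doppler spectrogram and the cadence velocity diagram (CVD), follow a standard construction: a short-time Fourier transform (STFT) of the slow-time radar return gives the spectrogram, and a Fourier transform of the spectrogram along its time axis exposes the repetitive cadence patterns that form the CVD. The sketch below only illustrates this standard construction; it is not the authors' code, and the sampling rate, window length, and overlap are placeholder values.

```python
import numpy as np
from scipy.signal import stft

def spectrogram_and_cvd(iq, fs, nperseg=256, noverlap=192):
    """Compute a micro-Doppler spectrogram and its cadence velocity diagram (CVD).

    iq : complex slow-time radar samples for one gait sequence (1-D array)
    fs : slow-time sampling rate in Hz
    The STFT window and overlap are illustrative defaults, not the paper's settings.
    """
    # Short-time Fourier transform -> time-frequency (micro-Doppler) map
    f, t, Z = stft(iq, fs=fs, nperseg=nperseg, noverlap=noverlap,
                   return_onesided=False)
    spec = np.abs(np.fft.fftshift(Z, axes=0))      # Doppler bins x time frames

    # CVD: FFT of each Doppler bin along the time axis reveals repetitive
    # (cadence) patterns such as the periodic limb motion of a gait cycle.
    cvd = np.abs(np.fft.fft(spec, axis=1))
    cadence_freqs = np.fft.fftfreq(spec.shape[1], d=(nperseg - noverlap) / fs)

    # Log-scale both maps before feeding them to the two network branches.
    return 20 * np.log10(spec + 1e-6), 20 * np.log10(cvd + 1e-6), cadence_freqs
```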
• The proposed method effectively extracts complementary information from both spectrograms and CVDs.
• The proposed Swin-T extracts discriminant features with physical meanings.
• The proposed Asymmetric Attention Fusion effectively combines features of known relative importance (an illustrative sketch follows below).
• A large-scale benchmark dataset, the NTU-RGR dataset, is developed to advance radar gait recognition. |
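The record does not give the exact formulation of the Asymmetric Attention Fusion, so the following is only a hypothetical sketch of the general idea: two branch embeddings are weighted by learned attention scores whose logits are biased toward the branch expected to carry more discriminant power, and the weighted sum is projected into a fused descriptor. The module name, feature dimension, and bias value are illustrative assumptions, not the published design.

```python
import torch
import torch.nn as nn

class AsymmetricAttentionFusion(nn.Module):
    """Hypothetical sketch of asymmetric fusion of two branch embeddings.

    Not the module from the paper: it learns per-branch attention scores and
    biases them toward the branch expected to carry more discriminant power.
    """

    def __init__(self, dim, primary_bias=1.0):
        super().__init__()
        self.score = nn.Linear(dim, 1)               # shared scoring head
        # Fixed asymmetric logit offsets: [primary branch, secondary branch]
        self.register_buffer("bias", torch.tensor([primary_bias, 0.0]))
        self.proj = nn.Linear(dim, dim)

    def forward(self, feat_spec, feat_cvd):
        # feat_spec, feat_cvd: (batch, dim) embeddings from the two Swin branches
        stacked = torch.stack([feat_spec, feat_cvd], dim=1)   # (batch, 2, dim)
        logits = self.score(stacked).squeeze(-1) + self.bias  # (batch, 2)
        weights = torch.softmax(logits, dim=1).unsqueeze(-1)  # (batch, 2, 1)
        fused = (weights * stacked).sum(dim=1)                # weighted sum
        return self.proj(fused)

# Usage with hypothetical 768-d branch embeddings:
# fusion = AsymmetricAttentionFusion(dim=768)
# out = fusion(spec_feat, cvd_feat)   # (batch, 768) fused descriptor
```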
ArticleNumber | 111101 |
Author | He, Wentao; Jiang, Xudong; Ren, Jianfeng; Bai, Ruibin |
Author_xml |
– sequence: 1; givenname: Wentao; surname: He; fullname: He, Wentao; orcidid: 0000-0002-6319-1639; organization: Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211, China
– sequence: 2; givenname: Jianfeng; surname: Ren; fullname: Ren, Jianfeng; email: jianfeng.ren@nottingham.edu.cn; orcidid: 0000-0003-4619-6590; organization: The Digital Port Technologies Lab, School of Computer Science, University of Nottingham Ningbo China, 199 Taikang East Road, Ningbo 315100, China
– sequence: 3; givenname: Ruibin; surname: Bai; fullname: Bai, Ruibin; orcidid: 0000-0003-1722-568X; organization: The Digital Port Technologies Lab, School of Computer Science, University of Nottingham Ningbo China, 199 Taikang East Road, Ningbo 315100, China
– sequence: 4; givenname: Xudong; surname: Jiang; fullname: Jiang, Xudong; orcidid: 0000-0002-9104-2315; organization: School of Electrical & Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, 639798, Singapore |
ContentType | Journal Article |
Copyright | 2024 The Authors |
DOI | 10.1016/j.patcog.2024.111101 |
Discipline | Computer Science |
ExternalDocumentID | 10_1016_j_patcog_2024_111101 S0031320324008525 |
ISSN | 0031-3203 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | Asymmetric Attention Fusion Micro-Doppler signature Radar gait recognition Spectrogram Cadence velocity diagram |
Language | English |
License | This is an open access article under the CC BY license. |
ORCID | 0000-0003-1722-568X 0000-0002-6319-1639 0000-0003-4619-6590 0000-0002-9104-2315 |
OpenAccessLink | https://www.sciencedirect.com/science/article/pii/S0031320324008525 |
PublicationCentury | 2000 |
PublicationDate | March 2025 |
PublicationDateYYYYMMDD | 2025-03-01 |
PublicationDate_xml | – month: 03 year: 2025 text: March 2025 |
PublicationDecade | 2020 |
PublicationTitle | Pattern Recognition |
PublicationYear | 2025 |
Publisher | Elsevier Ltd |
StartPage | 111101 |
SubjectTerms | Asymmetric Attention Fusion Cadence velocity diagram Micro-Doppler signature Radar gait recognition Spectrogram |
Title | Radar gait recognition using Dual-branch Swin Transformer with Asymmetric Attention Fusion |
URI | https://dx.doi.org/10.1016/j.patcog.2024.111101 |
Volume | 159 |