FMCW Radar-Based Hand Gesture Recognition Using Spatiotemporal Deformable and Context-Aware Convolutional 5-D Feature Representation


Bibliographic Details
Published in IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-11
Main Authors Dong, Xichao, Zhao, Zewei, Wang, Yupei, Zeng, Tao, Wang, Jianping, Sui, Yi
Format Journal Article
Language English
Published New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

Abstract Recently, frequency-modulated continuous-wave (FMCW) radar-based hand gesture recognition (HGR) using deep learning has achieved favorable performance. However, many existing methods use extracted features separately, i.e., using one of the range, Doppler, azimuth, or elevation angle information, or a combination of any two, to train convolutional neural networks (CNNs), which ignore the interrelation among the 5-D time-varying-range-Doppler-azimuth-elevation feature space. Although there have been methods using the 5-D information, their mining of the interrelation among the 5-D feature space is not sufficient, and there is still room for improvements. This article proposes a new processing scheme of HGR based on 5-D feature cubes that are jointly encoded by a 3-D fast Fourier transform (3-D-FFT)-based method. Then, a CNN is proposed by building two novel blocks, i.e., the spatiotemporal deformable convolution (STDC) block and the adaptive spatiotemporal context-aware convolution (ASTCAC) block. Concretely, STDC is designed to cope with hand gestures' large spatiotemporal geometric transformations in the 5-D feature space. Moreover, ASTCAC is designed for modeling long-distance global relationships, e.g., relationships between pixels of the feature at the upper left corner and lower right corner, and exploring the global spatiotemporal context, in order to enhance the target feature representation and suppress interference. Finally, our presented method is verified on a large radar dataset, including 19 760 sets of 16 common hand gestures, collected by 19 subjects. Our method obtains a recognition rate of 99.53% on the validation dataset and that of 97.22% on the test dataset, which is significantly better than state-of-the-art methods.
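For orientation, the sketch below illustrates the kind of FFT-based encoding the abstract describes: transforming a raw FMCW data cube into time-varying range-Doppler-azimuth-elevation cubes that a CNN can consume. It is a minimal NumPy sketch under stated assumptions; the array shapes, the 2 x 4 virtual-array layout, and the name gesture_feature_cubes are illustrative, not the authors' exact 3-D-FFT pipeline.

# Hedged sketch: FFT-based construction of 5-D gesture feature cubes
# (time x range x Doppler x elevation x azimuth). Shapes and the antenna
# layout are assumptions for illustration only.
import numpy as np

def gesture_feature_cubes(adc, n_el=2, n_az=4):
    """adc: complex raw data, shape (frames, chirps, rx_channels, samples).
    Returns magnitude cubes of shape (frames, range, doppler, elevation, azimuth)."""
    frames, chirps, n_rx, samples = adc.shape
    assert n_rx == n_el * n_az, "assumed 2-D virtual array layout"

    # 1) Range FFT over fast-time samples (per chirp).
    rng = np.fft.fft(adc, axis=-1)
    # 2) Doppler FFT over the chirps of each frame (slow time).
    dop = np.fft.fftshift(np.fft.fft(rng, axis=1), axes=1)
    # 3) Angle FFTs over the 2-D virtual array: elevation, then azimuth.
    dop = dop.reshape(frames, chirps, n_el, n_az, samples)
    ang = np.fft.fftshift(np.fft.fft(dop, axis=2), axes=2)  # elevation
    ang = np.fft.fftshift(np.fft.fft(ang, axis=3), axes=3)  # azimuth
    # Reorder to (time, range, doppler, elevation, azimuth); the frame axis
    # supplies the fifth, time-varying dimension fed to the CNN.
    return np.abs(np.transpose(ang, (0, 4, 1, 2, 3)))

# Example: 32 frames, 64 chirps/frame, 8 virtual RX channels, 128 samples/chirp.
cubes = gesture_feature_cubes(
    np.random.randn(32, 64, 8, 128) + 1j * np.random.randn(32, 64, 8, 128))
print(cubes.shape)  # (32, 128, 64, 2, 4)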
Author Dong, Xichao
Zeng, Tao
Sui, Yi
Wang, Jianping
Wang, Yupei
Zhao, Zewei
Author_xml – sequence: 1
  givenname: Xichao
  orcidid: 0000-0001-8624-8872
  surname: Dong
  fullname: Dong, Xichao
  organization: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
– sequence: 2
  givenname: Zewei
  orcidid: 0000-0003-4893-020X
  surname: Zhao
  fullname: Zhao, Zewei
  organization: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
– sequence: 3
  givenname: Yupei
  orcidid: 0000-0002-9771-6229
  surname: Wang
  fullname: Wang, Yupei
  email: wangyupei2019@outlook.com
  organization: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
– sequence: 4
  givenname: Tao
  orcidid: 0000-0002-2513-6014
  surname: Zeng
  fullname: Zeng, Tao
  organization: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
– sequence: 5
  givenname: Jianping
  orcidid: 0000-0001-9450-8961
  surname: Wang
  fullname: Wang, Jianping
  organization: Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS), Delft University of Technology, Delft, The Netherlands
– sequence: 6
  givenname: Yi
  orcidid: 0000-0002-3174-0015
  surname: Sui
  fullname: Sui, Yi
  organization: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
CODEN IGRSD2
CitedBy_id crossref_primary_10_1109_OJCOMS_2024_3411529
crossref_primary_10_1109_TIM_2024_3412196
crossref_primary_10_1145_3631433
crossref_primary_10_3390_s23177478
crossref_primary_10_1109_LGRS_2022_3217390
crossref_primary_10_1109_JSEN_2022_3216604
crossref_primary_10_1109_TIM_2023_3253906
crossref_primary_10_3390_electronics11010156
crossref_primary_10_1109_ACCESS_2023_3311265
crossref_primary_10_1109_TGRS_2022_3213748
crossref_primary_10_1109_TGRS_2023_3276023
crossref_primary_10_1109_TAP_2024_3373054
crossref_primary_10_1109_TMC_2024_3402356
crossref_primary_10_1109_JSEN_2024_3432972
crossref_primary_10_1109_JSEN_2022_3169231
crossref_primary_10_1109_JSEN_2024_3505145
crossref_primary_10_1109_JSEN_2024_3355395
Cites_doi 10.1109/CVPR.2018.00054
10.1109/VTC2020-Spring48590.2020.9128573
10.1109/JSEN.2020.2994292
10.1109/JSEN.2018.2808688
10.1109/TAP.1986.1143830
10.1109/TIP.2017.2785279
10.23919/APMC.2018.8617375
10.1109/LSP.2020.3013518
10.1109/CVPR.2014.223
10.1109/34.735811
10.1145/2897824.2925953
10.1109/GlobalSIP.2016.7905995
10.1007/978-3-030-01267-0_19
10.1007/978-3-030-01258-8_21
10.1109/CVPR42600.2020.00342
10.1109/CVPR.2018.00685
10.1109/JSEN.2020.3046991
10.1109/TSMCA.2011.2116004
10.23919/EURAD.2017.8249172
10.1007/978-3-319-46493-0_38
10.1109/TPAMI.2012.59
10.1109/TMTT.2020.3031619
10.1109/CVPR.2017.502
10.1109/ICASSP.2019.8682277
10.3390/s17040833
10.1049/cp.2017.0482
10.1109/CVPR.2018.00745
10.1109/ICCV.2017.89
10.1109/JSEN.2019.2892073
10.1109/EuCAP.2012.6206605
10.1016/j.patcog.2020.107416
10.1109/CVPR.2017.195
10.1109/ACCESS.2019.2897060
10.1109/TGRS.2020.3010880
10.1109/TMM.2017.2759504
10.1109/ACCESS.2019.2942305
10.1109/EMBC.2014.6945096
10.1109/MSP.2018.2890128
10.1162/neco.1997.9.8.1735
10.1109/MSP.2019.2903715
10.1109/RADAR42522.2020.9114664
10.1007/978-3-030-58595-2_46
10.1109/ICCV.2015.510
10.1109/IIH-MSP.2009.96
10.1109/LSENS.2018.2866371
10.1109/TIP.2020.3042059
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DOI 10.1109/TGRS.2021.3122332
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL) - NZ
CrossRef
Water Resources Abstracts
Technology Research Database
Environmental Sciences and Pollution Management
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Aerospace Database
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Advanced Technologies Database with Aerospace
DatabaseTitle CrossRef
Aerospace Database
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Technology Research Database
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Advanced Technologies Database with Aerospace
Water Resources Abstracts
Environmental Sciences and Pollution Management
DatabaseTitleList Aerospace Database

DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Physics
EISSN 1558-0644
EndPage 11
ExternalDocumentID 10_1109_TGRS_2021_3122332
9584925
Genre orig-research
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61960206009
  funderid: 10.13039/501100001809
– fundername: Special Fund for Research on National Major Research Instruments (NSFC)
  grantid: 61827901; 31727901
– fundername: China Postdoctoral Science Foundation
  grantid: 2020M670162
  funderid: 10.13039/501100002858
– fundername: Distinguished Young Scholars of Chongqing
  grantid: cstc2020jcyj-jqX0008
ISSN 0196-2892
IngestDate Mon Jun 30 08:18:03 EDT 2025
Thu Apr 24 22:57:30 EDT 2025
Tue Jul 01 01:34:32 EDT 2025
Wed Aug 27 02:49:40 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0002-3174-0015
0000-0002-9771-6229
0000-0001-8624-8872
0000-0003-4893-020X
0000-0001-9450-8961
0000-0002-2513-6014
PQID 2627838135
PQPubID 85465
PageCount 11
ParticipantIDs ieee_primary_9584925
proquest_journals_2627838135
crossref_primary_10_1109_TGRS_2021_3122332
crossref_citationtrail_10_1109_TGRS_2021_3122332
PublicationCentury 2000
PublicationDate 2022
PublicationDecade 2020
PublicationPlace New York
PublicationTitle IEEE transactions on geoscience and remote sensing
PublicationTitleAbbrev TGRS
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref12
ref15
ref14
(ref49) 2017
ref52
ref11
ref10
Wang (ref20) 2019; 41
Kuehne (ref51)
ref17
ref16
ref19
ref18
ref50
ref45
ref47
ref42
ref41
ref44
ref43
ref8
Jankiraman (ref48) 2018
ref9
ref4
ref3
ref6
ref5
ref40
ref35
ref34
ref37
ref36
ref31
ref30
ref33
ref32
ref2
ref1
ref39
ref38
ref24
ref23
ref26
ref25
ref22
ref21
ref28
ref27
Ming (ref7) 2019; 10
ref29
Brabandere (ref46)
References_xml – ident: ref30
  doi: 10.1109/CVPR.2018.00054
– start-page: 667
  volume-title: Proc. Adv. Neural Inf. Process. Syst.
  ident: ref46
  article-title: Dynamic filter networks
– ident: ref15
  doi: 10.1109/VTC2020-Spring48590.2020.9128573
– volume-title: FMCW Radar Design
  year: 2018
  ident: ref48
– ident: ref23
  doi: 10.1109/JSEN.2020.2994292
– ident: ref11
  doi: 10.1109/JSEN.2018.2808688
– ident: ref31
  doi: 10.1109/TAP.1986.1143830
– ident: ref28
  doi: 10.1109/TIP.2017.2785279
– ident: ref14
  doi: 10.23919/APMC.2018.8617375
– ident: ref42
  doi: 10.1109/LSP.2020.3013518
– ident: ref50
  doi: 10.1109/CVPR.2014.223
– ident: ref4
  doi: 10.1109/34.735811
– ident: ref10
  doi: 10.1145/2897824.2925953
– ident: ref25
  doi: 10.1109/GlobalSIP.2016.7905995
– ident: ref38
  doi: 10.1007/978-3-030-01267-0_19
– volume: 10
  start-page: 1
  issue: 1
  year: 2019
  ident: ref7
  article-title: A review of hand gesture and sign language recognition techniques
  publication-title: Int. J. Mach. Learn. Cybern.
– ident: ref40
  doi: 10.1007/978-3-030-01258-8_21
– ident: ref41
  doi: 10.1109/CVPR42600.2020.00342
– ident: ref36
  doi: 10.1109/CVPR.2018.00685
– ident: ref18
  doi: 10.1109/JSEN.2020.3046991
– ident: ref9
  doi: 10.1109/TSMCA.2011.2116004
– ident: ref13
  doi: 10.23919/EURAD.2017.8249172
– ident: ref37
  doi: 10.1007/978-3-319-46493-0_38
– ident: ref33
  doi: 10.1109/TPAMI.2012.59
– ident: ref2
  doi: 10.1109/TMTT.2020.3031619
– ident: ref34
  doi: 10.1109/CVPR.2017.502
– ident: ref21
  doi: 10.1109/ICASSP.2019.8682277
– ident: ref6
  doi: 10.3390/s17040833
– ident: ref12
  doi: 10.1049/cp.2017.0482
– ident: ref45
  doi: 10.1109/CVPR.2018.00745
– ident: ref29
  doi: 10.1109/ICCV.2017.89
– ident: ref16
  doi: 10.1109/JSEN.2019.2892073
– ident: ref32
  doi: 10.1109/EuCAP.2012.6206605
– ident: ref44
  doi: 10.1016/j.patcog.2020.107416
– volume: 41
  start-page: 822
  issue: 4
  year: 2019
  ident: ref20
  article-title: Gesture recognition with multi-dimensional parameter using FMCW radar
  publication-title: J. Electron. Inf. Technol.
– ident: ref52
  doi: 10.1109/CVPR.2017.195
– ident: ref19
  doi: 10.1109/ACCESS.2019.2897060
– ident: ref8
  doi: 10.1109/TGRS.2020.3010880
– ident: ref39
  doi: 10.1109/TMM.2017.2759504
– ident: ref24
  doi: 10.1109/ACCESS.2019.2942305
– ident: ref5
  doi: 10.1109/EMBC.2014.6945096
– ident: ref1
  doi: 10.1109/MSP.2018.2890128
– ident: ref27
  doi: 10.1162/neco.1997.9.8.1735
– ident: ref3
  doi: 10.1109/MSP.2019.2903715
– ident: ref22
  doi: 10.1109/RADAR42522.2020.9114664
– ident: ref47
  doi: 10.1007/978-3-030-58595-2_46
– ident: ref35
  doi: 10.1109/ICCV.2015.510
– volume-title: Robust Traffic and Intersection Monitoring Using Millimeter Wave Sensors
  year: 2017
  ident: ref49
– ident: ref26
  doi: 10.1109/IIH-MSP.2009.96
– ident: ref17
  doi: 10.1109/LSENS.2018.2866371
– ident: ref43
  doi: 10.1109/TIP.2020.3042059
– start-page: 571
  volume-title: Proc. HLRS
  ident: ref51
  article-title: HMDB: A large video database for human motion recognition
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1
SubjectTerms Artificial neural networks
Azimuth
Concrete blocks
Context
Continuous radiation
Convolution
Cubes
Datasets
Deep learning
Deformation
Doppler effect
Doppler sonar
Elevation angle
Estimation
Fast Fourier transformations
Feature extraction
Formability
Fourier transforms
Frequency dependence
Frequency-modulated continuous-wave (FMCW) radar
Geometric transformation
Gesture recognition
hand gesture recognition (HGR)
Machine learning
Methods
Neural networks
Radar
Representations
spatiotemporal context modeling
spatiotemporal deformable convolution (STDC)
Spatiotemporal phenomena
Three-dimensional displays
Title FMCW Radar-Based Hand Gesture Recognition Using Spatiotemporal Deformable and Context-Aware Convolutional 5-D Feature Representation
URI https://ieeexplore.ieee.org/document/9584925
https://www.proquest.com/docview/2627838135
Volume 60