Human Activity Classification Based on Point Clouds Measured by Millimeter Wave MIMO Radar With Deep Recurrent Neural Networks

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 21, No. 12, pp. 13522–13529
Main Authors Kim, Youngwook, Alnujaim, Ibrahim, Oh, Daegun
Format Journal Article
Language English
Published: New York, IEEE, 15.06.2021
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Abstract We investigate the feasibility of classifying human activities measured by a MIMO radar in the form of a point cloud. If a human subject is measured by a radar system that has very high angular resolution in azimuth and elevation, scatterers from the body can be localized. When precisely represented, individual points form a point cloud whose shape resembles that of the human subject. As the subject engages in various activities, the shapes of the point clouds change accordingly. We propose to classify human activities through recognition of point cloud variations. To construct a dataset, we used an FMCW MIMO radar to measure 19 human subjects performing 7 activities. The radar had 12 TXs and 16 RXs, producing a 33 × 31 virtual array with approximately 3.5 degrees of angular resolution in azimuth and elevation. To classify human activities, we used a deep recurrent neural network (DRNN) with a two-dimensional convolutional network. The convolutional filters captured the point clouds' features at each time instance for sequential input into the DRNN, which recognized time-varying signatures, producing a classification accuracy exceeding 97%.
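The pipeline the abstract describes, per-frame 2D convolutional feature extraction feeding a recurrent network that classifies the sequence, can be sketched in miniature as follows. This is only an illustrative sketch: the filter count, hidden size, plain tanh recurrence (standing in for the paper's DRNN), and random untrained weights are all assumptions; only the 33 × 31 frame size and the 7 activity classes come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# From the abstract: 7 activity classes, 33 x 31 angular grid per frame.
# T (number of frames per measurement) is an arbitrary assumption here.
NUM_CLASSES, H, W, T = 7, 33, 31, 10

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D cross-correlation (no padding, stride 1)."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def frame_features(frame, kernels):
    """Per-frame convolutional stage: conv + ReLU + global average pool."""
    return np.array([np.maximum(conv2d_valid(frame, k), 0).mean()
                     for k in kernels])

def classify_sequence(frames, kernels, Wx, Wh, Wo):
    """Tanh RNN over per-frame features, softmax over activity classes."""
    h = np.zeros(Wh.shape[0])
    for frame in frames:
        x = frame_features(frame, kernels)
        h = np.tanh(Wx @ x + Wh @ h)
    logits = Wo @ h
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Untrained random parameters: 4 conv filters (5x5), hidden size 16.
kernels = rng.standard_normal((4, 5, 5))
Wx = rng.standard_normal((16, 4)) * 0.1
Wh = rng.standard_normal((16, 16)) * 0.1
Wo = rng.standard_normal((NUM_CLASSES, 16)) * 0.1

# A fake sequence of T point-cloud frames rasterized to occupancy grids.
sequence = rng.random((T, H, W))
probs = classify_sequence(sequence, kernels, Wx, Wh, Wo)
```

The design point the sketch illustrates is the separation of concerns in the paper: convolution summarizes the spatial shape of the point cloud at each time instance, while the recurrence accumulates how that shape evolves across the activity.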
Author Alnujaim, Ibrahim
Oh, Daegun
Kim, Youngwook
Author_xml – sequence: 1
  givenname: Youngwook
  orcidid: 0000-0002-4067-6254
  surname: Kim
  fullname: Kim, Youngwook
  email: youngkim@csufresno.edu
  organization: Electrical and Computer Engineering Department, California State University, Fresno, CA, USA
– sequence: 2
  givenname: Ibrahim
  orcidid: 0000-0001-5610-0631
  surname: Alnujaim
  fullname: Alnujaim, Ibrahim
  organization: Electrical and Computer Engineering Department, California State University, Fresno, CA, USA
– sequence: 3
  givenname: Daegun
  surname: Oh
  fullname: Oh, Daegun
  organization: Advanced Radar Research Division, Daegu Gyeongbuk Institute of Science and Technology, Daegu, South Korea
CODEN ISJEAZ
CitedBy_id crossref_primary_10_1109_JSEN_2024_3355421
crossref_primary_10_1109_ACCESS_2024_3431692
crossref_primary_10_1109_JSEN_2022_3175618
crossref_primary_10_1109_TIM_2025_3545718
crossref_primary_10_1109_JSEN_2022_3145844
crossref_primary_10_3390_s24020648
crossref_primary_10_1109_JSEN_2022_3212687
crossref_primary_10_3390_rs15082101
crossref_primary_10_1109_JIOT_2023_3235268
crossref_primary_10_3390_signals3020017
crossref_primary_10_1109_JSEN_2021_3118836
crossref_primary_10_1109_JSEN_2024_3452110
crossref_primary_10_1109_JSEN_2024_3505145
crossref_primary_10_1088_1742_6596_2290_1_012059
crossref_primary_10_1109_JMW_2023_3264494
crossref_primary_10_1109_OJEMB_2024_3408078
crossref_primary_10_3390_rs14205177
crossref_primary_10_1109_JSEN_2023_3307390
crossref_primary_10_1109_TCPMT_2024_3352183
crossref_primary_10_1016_j_seta_2022_102910
crossref_primary_10_1109_TIM_2023_3302936
crossref_primary_10_1109_TIM_2024_3366575
crossref_primary_10_1109_OJAP_2023_3279090
crossref_primary_10_1109_JSEN_2023_3283778
crossref_primary_10_1109_OJCOMS_2024_3411529
crossref_primary_10_1109_JIOT_2023_3235808
crossref_primary_10_1109_JSEN_2024_3415078
crossref_primary_10_1109_TIM_2023_3298408
crossref_primary_10_1109_JSEN_2022_3203154
crossref_primary_10_1109_TGRS_2022_3189746
crossref_primary_10_1109_JSEN_2024_3485106
crossref_primary_10_3390_rs13183791
crossref_primary_10_1109_JSEN_2025_3535673
crossref_primary_10_1109_TRS_2025_3539289
crossref_primary_10_3390_rs16091522
crossref_primary_10_1109_TRS_2023_3341230
crossref_primary_10_3390_electronics12234785
crossref_primary_10_1109_LSENS_2023_3336793
crossref_primary_10_1109_JSEN_2024_3496552
crossref_primary_10_3390_s22239401
crossref_primary_10_1049_rsn2_12320
crossref_primary_10_3390_make5040075
crossref_primary_10_1109_JSEN_2021_3117942
crossref_primary_10_1109_JSEN_2022_3210579
crossref_primary_10_1088_1361_6501_ac849c
crossref_primary_10_1109_MCOM_011_2200580
crossref_primary_10_3390_app142210764
crossref_primary_10_1109_TMTT_2022_3200097
crossref_primary_10_31436_iiumej_v26i1_3268
Cites_doi 10.1109/TAES.2017.2740098
10.1109/IRS.2012.6233325
10.1049/iet-rsn.2015.0119
10.1201/9781315155340
10.1109/TMTT.2016.2597824
10.1109/LGRS.2019.2930636
10.1109/IEEECONF44664.2019.9048939
10.1109/ACCESS.2020.2971064
10.23919/EuCAP48036.2020.9135381
10.3390/s16121990
10.1109/TGRS.2009.2012849
10.1109/ISMICT.2016.7498911
10.1109/JSEN.2020.3006386
10.1002/047084535X
10.1109/LGRS.2019.2919770
10.1049/el:20060355
10.1109/ACCESS.2017.2778011
10.23919/ICIF.2018.8455344
10.23919/EURAD.2017.8249173
10.1109/LGRS.2020.2980320
10.1109/JSEN.2018.2808688
10.1109/CVPR.2015.7299101
10.1109/LGRS.2014.2311819
10.1109/RADAR.2016.7485147
10.1109/LGRS.2015.2491329
10.1109/LGRS.2009.2038728
10.1109/TGRS.2019.2907277
10.1109/LSP.2013.2289740
10.1109/MSP.2018.2890128
10.1049/iet-rsn.2013.0165
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DBID 97E
RIA
RIE
AAYXX
CITATION
7SP
7U5
8FD
L7M
DOI 10.1109/JSEN.2021.3068388
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Electronics & Communications Abstracts
Solid State and Superconductivity Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
DatabaseTitle CrossRef
Solid State and Superconductivity Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
Electronics & Communications Abstracts
DatabaseTitleList Solid State and Superconductivity Abstracts

Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Geography
Engineering
EISSN 1558-1748
EndPage 13529
ExternalDocumentID 10_1109_JSEN_2021_3068388
9387324
Genre orig-research
GrantInformation_xml – fundername: Daegu Gyeongbuk Institute of Science and Technology (DGIST) Research and Development Program of the Ministry of Science and Information and Communications Technology
  grantid: 19-ST-01
  funderid: 10.13039/501100010274
IEDL.DBID RIE
ISSN 1530-437X
IngestDate Mon Jun 30 10:19:38 EDT 2025
Tue Jul 01 03:37:02 EDT 2025
Thu Apr 24 22:59:31 EDT 2025
Wed Aug 27 02:50:50 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 12
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0001-5610-0631
0000-0002-4067-6254
PQID 2541469346
PQPubID 75733
PageCount 8
ParticipantIDs ieee_primary_9387324
crossref_primary_10_1109_JSEN_2021_3068388
crossref_citationtrail_10_1109_JSEN_2021_3068388
proquest_journals_2541469346
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2021-06-15
PublicationDateYYYYMMDD 2021-06-15
PublicationDate_xml – month: 06
  year: 2021
  text: 2021-06-15
  day: 15
PublicationDecade 2020
PublicationPlace New York
PublicationPlace_xml – name: New York
PublicationTitle IEEE sensors journal
PublicationTitleAbbrev JSEN
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref34
ref12
ref15
ref14
ref31
ref30
ref33
ref11
ref32
ref10
ref2
ref1
ref17
ref16
ref19
ref18
chen (ref26) 2010
feng (ref21) 2019
ref24
ref23
ref25
sengupta (ref20) 2020
ref22
kim (ref5) 2009; 47
ref28
ref29
ref8
ref7
ref9
ref4
ref3
ref6
qi (ref27) 2017
References_xml – ident: ref10
  doi: 10.1109/TAES.2017.2740098
– start-page: 732
  year: 2020
  ident: ref20
  article-title: MmWave radar point cloud segmentation using GMM in multimodal traffic monitoring
  publication-title: Proc IEEE Int Radar Conf (RADAR)
– ident: ref17
  doi: 10.1109/IRS.2012.6233325
– ident: ref8
  doi: 10.1049/iet-rsn.2015.0119
– ident: ref1
  doi: 10.1201/9781315155340
– ident: ref15
  doi: 10.1109/TMTT.2016.2597824
– ident: ref11
  doi: 10.1109/LGRS.2019.2930636
– ident: ref18
  doi: 10.1109/IEEECONF44664.2019.9048939
– year: 2010
  ident: ref26
  publication-title: Introduction to Direction-of-Arrival Estimation
– ident: ref30
  doi: 10.1109/ACCESS.2020.2971064
– ident: ref23
  doi: 10.23919/EuCAP48036.2020.9135381
– ident: ref13
  doi: 10.3390/s16121990
– volume: 47
  start-page: 1328
  year: 2009
  ident: ref5
  article-title: Human activity classification based on micro-Doppler signatures using a support vector machine
  publication-title: IEEE Trans Geosci Remote Sens
  doi: 10.1109/TGRS.2009.2012849
– ident: ref16
  doi: 10.1109/ISMICT.2016.7498911
– ident: ref31
  doi: 10.1109/JSEN.2020.3006386
– ident: ref28
  doi: 10.1002/047084535X
– ident: ref14
  doi: 10.1109/LGRS.2019.2919770
– ident: ref22
  doi: 10.1049/el:20060355
– ident: ref34
  doi: 10.1109/ACCESS.2017.2778011
– ident: ref19
  doi: 10.23919/ICIF.2018.8455344
– ident: ref29
  doi: 10.23919/EURAD.2017.8249173
– ident: ref12
  doi: 10.1109/LGRS.2020.2980320
– ident: ref4
  doi: 10.1109/JSEN.2018.2808688
– start-page: 1
  year: 2019
  ident: ref21
  article-title: Point cloud segmentation with a high-resolution automotive radar
  publication-title: Proc AmE-Automot Meets Electron 10th GMM-Symp
– ident: ref33
  doi: 10.1109/CVPR.2015.7299101
– ident: ref6
  doi: 10.1109/LGRS.2014.2311819
– ident: ref3
  doi: 10.1109/RADAR.2016.7485147
– ident: ref9
  doi: 10.1109/LGRS.2015.2491329
– ident: ref24
  doi: 10.1109/LGRS.2009.2038728
– ident: ref32
  doi: 10.1109/TGRS.2019.2907277
– start-page: 652
  year: 2017
  ident: ref27
  article-title: PointNet: Deep learning on point sets for 3D classification and segmentation
  publication-title: Proc IEEE Conf Comput Vis Pattern Recognit
– ident: ref25
  doi: 10.1109/LSP.2013.2289740
– ident: ref2
  doi: 10.1109/MSP.2018.2890128
– ident: ref7
  doi: 10.1049/iet-rsn.2013.0165
SSID ssj0019757
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 13522
SubjectTerms Angular resolution
Azimuth
Classification
deep convolutional neural networks
deep recurrent neural networks
Feature extraction
FMCW radar
Human activity classification
Human performance
Human subjects
Millimeter wave radar
Millimeter waves
MIMO radar
Neural networks
point clouds
Radar
Radar antennas
Radar equipment
Radar measurements
Recurrent neural networks
Shape
Three-dimensional displays
Title Human Activity Classification Based on Point Clouds Measured by Millimeter Wave MIMO Radar With Deep Recurrent Neural Networks
URI https://ieeexplore.ieee.org/document/9387324
https://www.proquest.com/docview/2541469346
Volume 21