Human Activity Classification Based on Point Clouds Measured by Millimeter Wave MIMO Radar With Deep Recurrent Neural Networks

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 21, No. 12, pp. 13522-13529
Main Authors: Kim, Youngwook; Alnujaim, Ibrahim; Oh, Daegun
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 15.06.2021

Summary: We investigate the feasibility of classifying human activities measured by a MIMO radar in the form of a point cloud. If a human subject is measured by a radar system that has very high angular resolution in azimuth and elevation, scatterers from the body can be localized. When precisely represented, the individual points form a point cloud whose shape resembles that of the human subject. As the subject engages in various activities, the shapes of the point clouds change accordingly. We propose to classify human activities by recognizing these point cloud variations. To construct a dataset, we used an FMCW MIMO radar to measure 19 human subjects performing 7 activities. The radar had 12 TXs and 16 RXs, producing a 33 × 31 virtual array with approximately 3.5 degrees of angular resolution in azimuth and elevation. To classify human activities, we used a deep recurrent neural network (DRNN) with a two-dimensional convolutional network. The convolutional filters captured the point clouds' features at each time instance for sequential input into the DRNN, which recognized time-varying signatures, producing a classification accuracy exceeding 97%.
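The pipeline the summary describes (per-frame point clouds → 2D convolutional feature extraction → recurrent classification over time) can be sketched in miniature. This is an illustrative toy with random, untrained weights, not the authors' implementation: the grid size, kernel count, hidden size, and all helper names (`voxelize`, `conv_features`, `rnn_classify`) are assumptions, and a simple tanh recurrent cell stands in for the paper's DRNN.

```python
import numpy as np

rng = np.random.default_rng(0)

def voxelize(points, grid=(16, 16)):
    """Bin (azimuth, elevation) point coordinates in [0, 1) into a 2D occupancy grid."""
    g = np.zeros(grid)
    idx = np.clip((points * grid).astype(int), 0, np.array(grid) - 1)
    for a, e in idx:
        g[a, e] += 1.0
    return g

def conv_features(grid, kernels):
    """Valid 2D convolution with each kernel, ReLU, then global average pooling."""
    kh, kw = kernels.shape[1:]
    H, W = grid.shape
    feats = []
    for k in kernels:
        out = np.zeros((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(grid[i:i + kh, j:j + kw] * k)
        feats.append(np.maximum(out, 0).mean())
    return np.array(feats)

def rnn_classify(frames, kernels, Wx, Wh, Wo):
    """Feed per-frame conv features through a recurrent cell; softmax over the last state."""
    h = np.zeros(Wh.shape[0])
    for frame in frames:
        x = conv_features(voxelize(frame), kernels)
        h = np.tanh(Wx @ x + Wh @ h)
    logits = Wo @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

# Toy dimensions (assumptions): 8 conv kernels, hidden size 16, 7 activity classes.
F, Hd, C = 8, 16, 7
kernels = rng.standard_normal((F, 3, 3))
Wx = rng.standard_normal((Hd, F)) * 0.1
Wh = rng.standard_normal((Hd, Hd)) * 0.1
Wo = rng.standard_normal((C, Hd)) * 0.1

# A sequence of 20 point-cloud frames, each with 50 random scatterer points.
sequence = [rng.random((50, 2)) for _ in range(20)]
probs = rnn_classify(sequence, kernels, Wx, Wh, Wo)
print(probs.shape)  # (7,)
```

In the paper itself the convolutional filters and recurrent weights are of course trained end-to-end on the measured dataset; this sketch only shows the data flow.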
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2021.3068388