Polarimetric HRRP Recognition Using Vision Transformer with Polarimetric Preprocessing and Attention Loss

Bibliographic Details
Published in: IEEE International Geoscience and Remote Sensing Symposium proceedings, pp. 10838-10842
Main Authors: Gao, Fan; Ren, Dawei; Yin, Junjun; Yang, Jian
Format: Conference Proceeding
Language: English
Published: IEEE, 07.07.2024
ISSN: 2153-7003
DOI: 10.1109/IGARSS53475.2024.10640945

Summary: Polarimetric High-Resolution Range Profile (HRRP) holds significant potential for Radar Automatic Target Recognition (RATR) because it provides detailed polarimetric and spatial information. In recent years, deep learning methods have been widely applied to RATR based on polarimetric HRRP. However, these methods often focus solely on local or temporal features and therefore fail to fully exploit the spatial information; valuable polarimetric information is also frequently overlooked. Moreover, most of these methods do not distinguish between target and noise areas, so crucial range cells receive too little emphasis. This paper introduces a method for polarimetric HRRP recognition based on the Vision Transformer (ViT), which effectively extracts both local and global features. Our approach incorporates a polarimetric preprocessing step in which manual features and Convolutional Neural Networks (CNNs) are combined to enhance the extraction of polarimetric features. To direct the network's focus toward significant range cells, we design a novel attention loss. Experimental results demonstrate that the proposed method improves recognition accuracy and maintains robustness in noisy environments.
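
To make the described pipeline concrete, below is a minimal PyTorch sketch of such an architecture: manual polarimetric features fused by a small CNN, a Transformer encoder over range-cell tokens, and an auxiliary attention loss. The choice of manual features (channel powers and the co-pol phase difference), all layer sizes, and the exact form of the attention loss are illustrative assumptions only; the paper's actual design may differ.

# Minimal sketch, assuming four complex scattering channels (HH, HV, VH, VV)
# per range cell. All names, sizes, and the loss form are illustrative, not
# the authors' exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

def manual_polarimetric_features(x):
    """x: complex tensor (B, 4, n_cells). Returns real features (B, 5, n_cells)."""
    power = x.abs() ** 2                           # |S|^2 for each channel
    phase = torch.angle(x[:, 0] * x[:, 3].conj())  # arg(S_HH * conj(S_VV))
    return torch.cat([power, phase.unsqueeze(1)], dim=1)

class PolarimetricViT(nn.Module):
    def __init__(self, n_cells=256, d_model=64, n_heads=4, n_layers=2, n_classes=10):
        super().__init__()
        # CNN stem fuses the manual features into one embedding per range cell.
        self.cnn = nn.Sequential(
            nn.Conv1d(5, d_model, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.pos = nn.Parameter(torch.zeros(1, n_cells, d_model))
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.score = nn.Linear(d_model, 1)  # per-cell scores for the attention loss
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        tokens = self.cnn(manual_polarimetric_features(x)).transpose(1, 2)
        tokens = tokens + self.pos
        tokens = torch.cat([self.cls.expand(x.shape[0], -1, -1), tokens], dim=1)
        enc = self.encoder(tokens)
        # Normalized per-cell attention over range cells (class token excluded).
        attn = self.score(enc[:, 1:]).squeeze(-1).softmax(dim=-1)
        return self.head(enc[:, 0]), attn

def attention_loss(attn, target_mask):
    """Penalize attention mass on noise cells (target_mask: 1 on target cells)."""
    return (attn * (1.0 - target_mask)).sum(dim=-1).mean()

# Usage with random stand-in data:
model = PolarimetricViT()
x = torch.randn(8, 4, 256, dtype=torch.cfloat)  # simulated polarimetric HRRP batch
mask = (torch.rand(8, 256) > 0.7).float()       # hypothetical target/noise mask
logits, attn = model(x)
loss = F.cross_entropy(logits, torch.randint(0, 10, (8,))) + 0.1 * attention_loss(attn, mask)

The auxiliary term here simply discourages attention on cells outside the target mask, which is one plausible way to emphasize crucial range cells as the abstract describes; the paper may define its attention loss differently.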