Bilinear Mixing Model-Based Spectral Decomposition Deep Neural Network for Hyperspectral Target Detection
Published in | IEEE Transactions on Geoscience and Remote Sensing, Vol. 63, pp. 1-17
---|---
Format | Journal Article
Language | English
Published | IEEE, 2025
Summary | Hyperspectral image (HSI) target detection is an advanced technology in the remote sensing community, widely applied in both civilian and military domains. However, subpixel target detection remains a significant challenge in HSI processing. Existing methods for HSI target detection mostly construct detectors based on a linear mixing model (LMM). However, because of multiple scattering, the LMM fails to capture the nonlinear features and extensive interactions among the various materials commonly present in real HSIs, which can lead to poor performance. To address this issue, this work proposes a novel hyperspectral target detection (HTD) method based on a bilinear mixing model (BMM). Specifically, a spectral decomposition network is designed to separate the background and the target. To further extract and combine the spectral and spatial information of the image for more accurate learning of the target and background distributions, a deep network model based on a multiscale feature extraction module is proposed. Finally, to enhance the discrimination between targets and backgrounds, a customized constrained energy minimization (CEM) loss is formulated by minimizing the response of the background and maximizing the response of the target. Experimental comparisons on six real HSI datasets validate the superiority of the proposed detector.
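The two ideas named in the summary, a bilinear mixing model and a CEM-style response criterion, can be illustrated with the classical closed-form constrained energy minimization filter applied to pixels synthesized under a simple BMM. This is a minimal sketch using made-up random spectra and abundances, not the paper's spectral decomposition network, multiscale module, or datasets; every spectrum, abundance range, and coefficient below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
bands, n_pix = 50, 500

# Hypothetical endmember spectra (illustrative only, not from the paper).
e_bg = rng.random(bands)    # background endmember
e_tgt = rng.random(bands)   # target endmember

def bmm_pixel(a_bg, a_tgt, beta, noise=0.01):
    """Bilinear mixing model: linear abundances plus a second-order
    interaction term (element-wise product of the two endmembers)."""
    x = a_bg * e_bg + a_tgt * e_tgt + beta * (e_bg * e_tgt)
    return x + noise * rng.standard_normal(bands)

# Synthetic scene: mostly pure background, a few subpixel targets.
abund = rng.uniform(0.0, 0.2, n_pix)   # subpixel target abundances
abund[: n_pix - 20] = 0.0              # only the last 20 pixels contain target
X = np.stack([bmm_pixel(1 - a, a, 0.1 * a) for a in abund])  # (n_pix, bands)

# Classical CEM filter: w = R^{-1} d / (d^T R^{-1} d), which minimizes the
# average filter output energy (background response) subject to w^T d = 1
# (unit target response) -- the same minimize-background / keep-target
# trade-off the customized CEM loss encodes.
R = X.T @ X / n_pix                 # sample correlation matrix
Rinv_d = np.linalg.solve(R, e_tgt)
w = Rinv_d / (e_tgt @ Rinv_d)
scores = X @ w                      # detection scores per pixel

# Target-bearing pixels should score higher on average than background.
print(scores[-20:].mean(), scores[:-20].mean())
```

The closed-form filter suppresses whatever dominates the correlation statistics (here the background endmember) while holding the target response fixed, which is why subpixel targets separate in the score even though no pixel is purely target.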
ISSN | 0196-2892, 1558-0644
DOI | 10.1109/TGRS.2025.3590079