Deep Spatial-Spectral Subspace Clustering for Hyperspectral Image


Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 31, No. 7, pp. 2686-2697
Main Authors: Lei, Jianjun; Li, Xinyu; Peng, Bo; Fang, Leyuan; Ling, Nam; Huang, Qingming
Format: Journal Article
Language: English
Published: New York, IEEE, 01.07.2021
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

More Information
Summary: Hyperspectral image (HSI) clustering is a challenging task due to the complex characteristics of HSI data, such as spatial-spectral structure, high dimensionality, and large spectral variability. In this paper, we propose a novel deep spatial-spectral subspace clustering network (DS3C-Net), which explores spatial-spectral information via a multi-scale auto-encoder and a collaborative constraint. Considering the structural correlations of HSI, the multi-scale auto-encoder is first designed to extract spatial-spectral features from pixel blocks of different scales, which are selected as the inputs. Then, collaborative constrained self-expressive layers are introduced between the encoder and decoder to capture the self-expressive subspace structures. By designing a self-expressiveness similarity constraint, the proposed network is trained collaboratively, and the affinity matrices of the feature representations are learned in an end-to-end manner. Based on the affinity matrices, the spectral clustering algorithm is applied to obtain the final HSI clustering result. Experimental results on three widely used hyperspectral image datasets demonstrate that the proposed method outperforms state-of-the-art methods.
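As a rough illustration of the last two stages the summary describes (self-expressive affinity learning followed by spectral clustering), the sketch below works on a feature matrix `Z` whose columns stand in for encoder outputs of individual pixels. The ridge-regularized closed form, the function names, and the parameter `lam` are illustrative assumptions, not the paper's learned self-expressive layers:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def self_expressive_affinity(Z, lam=1e-2):
    """Solve min_C ||Z - Z C||_F^2 + lam ||C||_F^2 in closed form and
    turn the coefficient matrix into a symmetric affinity matrix."""
    n = Z.shape[1]
    G = Z.T @ Z                                   # Gram matrix of the features
    C = np.linalg.solve(G + lam * np.eye(n), G)   # ridge self-expression
    np.fill_diagonal(C, 0.0)                      # a pixel must not represent itself
    return 0.5 * (np.abs(C) + np.abs(C).T)        # affinity A = (|C| + |C|^T) / 2

def spectral_clustering(A, k, seed=0):
    """Standard normalized-Laplacian spectral clustering on affinity A."""
    d = np.maximum(A.sum(axis=1), 1e-12)
    D_isqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(A.shape[0]) - D_isqrt @ A @ D_isqrt
    _, vecs = np.linalg.eigh(L)                   # eigenvalues in ascending order
    U = vecs[:, :k]                               # embed into k smallest eigenvectors
    U /= np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    _, labels = kmeans2(U, k, minit="++", seed=seed)
    return labels

# Toy usage: 20 "pixels" from two well-separated feature clusters (columns of Z).
rng = np.random.default_rng(0)
Z = np.hstack([rng.normal([5, 0], 0.1, (10, 2)).T,
               rng.normal([0, 5], 0.1, (10, 2)).T])
labels = spectral_clustering(self_expressive_affinity(Z), k=2)
```

In the paper this affinity step is learned end-to-end inside the network rather than solved in closed form, but the roles of the pieces match: the self-expressive coefficients yield the affinity matrix, and spectral clustering on that affinity produces the final labels.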
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN: 1051-8215
1558-2205
DOI: 10.1109/TCSVT.2020.3027616