Lightweight Tensorized Neural Networks for Hyperspectral Image Classification


Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, p. 1
Main Authors: Ma, Tian-Yu; Li, Heng-Chao; Wang, Rui; Du, Qian; Jia, Xiuping; Plaza, Antonio
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

Summary: Deep learning methods have demonstrated excellent performance in hyperspectral image (HSI) classification. However, these methods mainly focus on improving classification accuracy while ignoring their high complexity. Considering that the data formats of both HSIs and network weights can be represented as tensors, we develop a new lightweight tensorized neural network for HSI classification that takes advantage of low-rank tensor decomposition techniques to reduce complexity. First, inspired by tensor train (TT)-based tensorized convolutional layers, a new tensorized 2D convolutional layer based on chain calculation (with better expression ability) is introduced. Based on this innovation, a new lightweight 2D tensorized neural network (2D-TNN) is designed for HSI classification. Furthermore, to better preserve the intrinsic structure of HSI data, a new lightweight 3D tensorized neural network (3D-TNN) is proposed by extending the tensorized 2D convolutional layers to their 3D versions. Quantitative and comparative experiments on three widely used data sets show that the proposed models achieve state-of-the-art performance (with a low number of model parameters) for different training sample sizes, especially for very small training sets.
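The parameter savings that motivate TT-based tensorized layers can be illustrated with a back-of-the-envelope calculation. The sketch below is not the paper's architecture; the layer shape, channel factorizations, TT-ranks, and the convention of keeping a separate small spatial core are all assumptions chosen for demonstration.

```python
# Illustrative parameter count for a tensor-train (TT) factorization of a
# convolutional weight tensor. All shapes, factorizations, and TT-ranks
# below are hypothetical, not taken from the paper.

# Dense 2D conv layer: C_in = 256, C_out = 256, 3x3 kernels.
c_in, c_out, k = 256, 256, 3
dense_params = c_in * c_out * k * k  # 589,824 weights

# TT format: factor the channel dimensions as 256 = 4*4*4*4 and store a
# chain of small cores G_i with shape (r_{i-1}, in_i, out_i, r_i).
in_factors = out_factors = [4, 4, 4, 4]
ranks = [1, 8, 8, 8, 1]  # assumed TT-ranks; boundary ranks r_0 = r_d = 1

tt_params = k * k  # small spatial core (one simple convention)
for i in range(len(in_factors)):
    tt_params += ranks[i] * in_factors[i] * out_factors[i] * ranks[i + 1]

print(dense_params, tt_params)  # dense vs. TT parameter counts
print(round(dense_params / tt_params, 1))  # compression ratio
```

With these assumed ranks the TT chain stores roughly 2.3k weights instead of about 590k, a compression of over two orders of magnitude; the trade-off is that larger TT-ranks recover more expressive power at the cost of more parameters.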
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3225438