GACP: graph neural networks with ARMA filters and a parallel CNN for hyperspectral image classification
Published in | International Journal of Digital Earth, Vol. 16, No. 1, pp. 1770-1800 |
Main Authors | |
Format | Journal Article |
Language | English |
Published | Abingdon: Taylor & Francis, 31.12.2023 |
Summary: | In recent years, the use of convolutional neural networks (CNNs) and graph neural networks (GNNs) to classify hyperspectral images (HSIs) has achieved excellent results, and such methods are widely used in agricultural remote sensing, geological exploration, and marine remote sensing. Although many generalized classification algorithms are designed to learn from a small number of samples, they often make poor use of positional information in the spatial-spectral domain. To address this, a GNN with autoregressive moving average (ARMA) smoothing filters samples node information in the spatial-spectral domain, a parallel CNN captures pixel-level spatial information via spatial feature convolution, and a coordinate attention (CA) mechanism recovers the spatial-spectral position information lost by the CNN. Finally, the ARMA, spatial-convolution, and CA branches are combined into multiscale features to enhance the network's capacity to learn from tiny sample sets. Experiments conducted on four widely used benchmark HSI datasets, Indian Pines (IP), Botswana (BS), Houston 2013 (H2013), and WHU-Hi-HongHu (WHU), demonstrate that the proposed GACP technique performs classification with good accuracy even with a small number of training examples. |
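The ARMA graph filter mentioned in the abstract can be illustrated with a minimal numpy sketch. This assumes the recursive ARMA formulation commonly used for GNNs (Bianchi et al.), X^(t+1) = ReLU(Â X^(t) W + X V), with a symmetrically normalized adjacency Â; the matrices `w` and `v` and the function name `arma_filter` are hypothetical placeholders, not the actual GACP layer, whose exact parameterization is not given in this record.

```python
import numpy as np

def arma_filter(adj, feats, w, v, iters=3):
    """Sketch of one ARMA graph-filter branch.

    adj   : (n, n) unweighted adjacency matrix of the pixel/superpixel graph
    feats : (n, f) node features (e.g. spectral vectors)
    w, v  : (f, f) weight matrices (hypothetical; shared across iterations here)
    iters : number of recursive propagation steps
    """
    # Symmetrically normalized adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    a = adj + np.eye(adj.shape[0])
    d = 1.0 / np.sqrt(a.sum(axis=1))
    a_hat = a * d[:, None] * d[None, :]

    # Recursion: X^(t+1) = ReLU(A_hat @ X^(t) @ W + X @ V),
    # where the skip term X @ V re-injects the raw node features each step.
    x = feats
    for _ in range(iters):
        x = np.maximum(a_hat @ x @ w + feats @ v, 0.0)
    return x
```

In the full ARMA_K filter, several such branches with independent weights are run in parallel and their outputs averaged, which is how a multi-branch design like GACP could fuse this output with the CNN and CA streams.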
ISSN | 1753-8947, 1753-8955 |
DOI | 10.1080/17538947.2023.2210310 |