Remote Sensing (Sep 2024)

PGNN-Net: Parallel Graph Neural Networks for Hyperspectral Image Classification Using Multiple Spatial-Spectral Features

  • Ningbo Guo,
  • Mingyong Jiang,
  • Decheng Wang,
  • Yutong Jia,
  • Kaitao Li,
  • Yanan Zhang,
  • Mingdong Wang,
  • Jiancheng Luo

DOI
https://doi.org/10.3390/rs16183531
Journal volume & issue
Vol. 16, no. 18
p. 3531

Abstract


Hyperspectral images (HSIs) show great potential for remote sensing applications due to their rich spectral information and fine spatial resolution. However, the high dimensionality, nonlinearity, and complex relationship between the spectral and spatial features of HSIs pose challenges to accurate classification. Traditional convolutional neural network (CNN)-based methods suffer from detail loss during feature extraction; Transformer-based methods depend heavily on the quantity and quality of HSI data; and graph neural network (GNN)-based methods provide a new impetus for HSI classification owing to their ability to handle irregular data. To address these challenges and exploit the advantages of GNNs, we propose a network of parallel GNNs called PGNN-Net. The network first extracts the key spatial-spectral features of the HSI using principal component analysis, followed by preprocessing to obtain two primary features and a normalized adjacency matrix. Then, a parallel architecture is constructed using an improved GCN and ChebNet to extract local and global spatial-spectral features, respectively. Finally, the discriminative features obtained through the fusion strategy are fed into the classifier to obtain the classification results. In addition, to alleviate over-fitting, label smoothing is embedded in the cross-entropy loss function. The experimental results show that the average overall accuracy of our method on the Indian Pines, Kennedy Space Center, Pavia University Scene, and Botswana datasets reaches 97.35%, 99.40%, 99.64%, and 98.46%, respectively, outperforming several state-of-the-art methods.
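
The pipeline described in the abstract (symmetrically normalized adjacency matrix, parallel GCN and ChebNet branches, feature fusion, classifier, and label-smoothed cross-entropy) can be illustrated with a minimal PyTorch sketch. All layer sizes, the Chebyshev order, the concatenation-based fusion, the smoothing value, and the module names below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a parallel GCN/ChebNet classifier (assumptions throughout).
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2."""
    a_hat = adj + torch.eye(adj.size(0), device=adj.device)
    deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)


def scaled_laplacian(a_hat: torch.Tensor) -> torch.Tensor:
    """Rescaled Laplacian for ChebNet, assuming lambda_max ~ 2: L~ = L - I = -A_hat."""
    return -a_hat


class GCNLayer(nn.Module):
    """Plain GCN propagation: H' = ReLU(A_hat H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):
        return F.relu(a_hat @ self.lin(x))


class ChebLayer(nn.Module):
    """Chebyshev polynomial filter of order k (k >= 2) on the scaled Laplacian."""
    def __init__(self, in_dim, out_dim, k=3):
        super().__init__()
        self.lins = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in range(k)])

    def forward(self, x, laplacian):
        # Recurrence: T_0 = X, T_1 = L X, T_k = 2 L T_{k-1} - T_{k-2}
        t_prev, t_curr = x, laplacian @ x
        out = self.lins[0](t_prev) + self.lins[1](t_curr)
        for lin in self.lins[2:]:
            t_prev, t_curr = t_curr, 2 * laplacian @ t_curr - t_prev
            out = out + lin(t_curr)
        return F.relu(out)


class ParallelGNN(nn.Module):
    """Two parallel branches whose outputs are fused (here by concatenation)."""
    def __init__(self, in_dim, hidden, n_classes):
        super().__init__()
        self.gcn = GCNLayer(in_dim, hidden)
        self.cheb = ChebLayer(in_dim, hidden)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x, a_hat, laplacian):
        fused = torch.cat([self.gcn(x, a_hat), self.cheb(x, laplacian)], dim=-1)
        return self.classifier(fused)


# Toy usage on random data (all shapes and values are illustrative).
n_nodes, n_bands, n_classes = 8, 30, 5
x = torch.rand(n_nodes, n_bands)                        # PCA-reduced features per node
adj = (torch.rand(n_nodes, n_nodes) > 0.5).float()
adj = ((adj + adj.T) > 0).float()                       # symmetrize
a_hat = normalized_adjacency(adj)
model = ParallelGNN(n_bands, hidden=16, n_classes=n_classes)
logits = model(x, a_hat, scaled_laplacian(a_hat))
# Label smoothing embedded in the cross-entropy loss, as the abstract describes;
# the smoothing factor 0.1 is an assumption.
loss = nn.CrossEntropyLoss(label_smoothing=0.1)(
    logits, torch.randint(0, n_classes, (n_nodes,))
)
```

Concatenating the two branch outputs is only one plausible fusion strategy; the paper's actual fusion, preprocessing, and graph construction should be taken from the full text.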

Keywords