IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)

AANet: Adaptive Attention Networks for Semantic Segmentation of High-Resolution Remote Sensing Imagery

  • Yan Chen,
  • Qianchuan Zhang,
  • Xiaofeng Wang,
  • Quan Dong,
  • Menglei Kang,
  • Wenxiang Jiang,
  • Mengyuan Wang,
  • Lixiang Xu,
  • Chen Zhang

DOI
https://doi.org/10.1109/JSTARS.2024.3443283
Journal volume & issue
Vol. 17
pp. 14640 – 14655

Abstract

Contextual information can effectively help deep-learning models extract interclass and intraclass difference features from remote sensing images. This article presents the adaptive attention network (AANet), a novel approach to semantic segmentation of high-resolution remote sensing images. AANet aims to improve segmentation performance while keeping the network's computational cost and parameter count low, and it is designed to support real-time segmentation. AANet comprises three modules: the multiscale channel attention module (MCAM), the multidimensional spatial attention module (MSAM), and the contextual information adaptive fusion module (CIAFM). MCAM employs a multiscale approach to capture contextual information from neighboring channels together with category information. MSAM extracts and combines detailed information from each dimension of the spatial domain. CIAFM exploits the complementary nature of channel and spatial contextual information and the correlation between pixels and categories. Experiments on the ISPRS Vaihingen, ISPRS Potsdam, and multiobject coastal supervision semantic segmentation dataset (MO-CSSSD) benchmarks, together with a comparative analysis against conventional semantic segmentation models, show that the proposed approach performs strongly, achieving mean intersection-over-union scores of 83.17%, 85.67%, and 89.68% on Vaihingen, Potsdam, and MO-CSSSD, respectively.
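The abstract does not give the internals of MCAM, so as a purely illustrative sketch (not the authors' implementation), the general idea of capturing "contextual information from neighboring channels" at multiple scales can be shown with an ECA-style gate: global average pooling produces one descriptor per channel, smoothing filters of several kernel sizes mix each descriptor with its neighbors, and a sigmoid reweights the channels. All function names and kernel sizes below are assumptions for illustration only.

```python
import numpy as np

def _conv1d_same(v, k):
    # Simple 1D smoothing over the channel axis with 'same' padding;
    # a uniform kernel stands in for a learned 1D convolution.
    pad = k // 2
    vp = np.pad(v, pad, mode="edge")
    kernel = np.ones(k) / k
    return np.convolve(vp, kernel, mode="valid")

def multiscale_channel_attention(x, kernel_sizes=(3, 5)):
    """Hypothetical sketch of multiscale channel attention.

    x: feature map of shape (C, H, W). Each channel is reweighted by a
    gate computed from its neighboring channels at several scales.
    """
    desc = x.mean(axis=(1, 2))                                   # global average pool -> (C,)
    mixed = np.mean([_conv1d_same(desc, k) for k in kernel_sizes], axis=0)
    weights = 1.0 / (1.0 + np.exp(-mixed))                       # sigmoid gate per channel
    return x * weights[:, None, None]                            # channel-wise reweighting

x = np.random.rand(8, 4, 4).astype(np.float32)  # toy feature map: 8 channels, 4x4
y = multiscale_channel_attention(x)
print(y.shape)  # same shape as the input: (8, 4, 4)
```

In a trained network the uniform kernels would be learned 1D convolutions, but the structure of the computation (pool, mix neighboring channels at several scales, gate) is the same.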

Keywords