Remote Sensing (Nov 2023)

A Full-Scale Connected CNN–Transformer Network for Remote Sensing Image Change Detection

  • Min Chen,
  • Qiangjiang Zhang,
  • Xuming Ge,
  • Bo Xu,
  • Han Hu,
  • Qing Zhu,
  • Xin Zhang

DOI
https://doi.org/10.3390/rs15225383
Journal volume & issue
Vol. 15, no. 22
p. 5383

Abstract


Recent studies have introduced transformer modules into convolutional neural networks (CNNs) to overcome the inherent limitations of CNNs in global modeling and have achieved impressive performance. However, several challenges remain: first, networks with simple connections between the CNN and the transformer perform poorly in small change areas; second, networks that rely only on transformer structures tend to produce coarse detection results and over-generalize feature boundaries. In addition, existing methods of fusing the CNN and the transformer suffer from a unilateral flow of feature information and limited inter-scale communication, leading to a loss of change information across scales. To mitigate these problems, this study proposes SUT, a full-scale connected CNN–Transformer network for change detection in remote sensing images that incorporates a Siamese structure, Unet3+, and a transformer. A progressive attention module (PAM) is adopted in SUT to deeply integrate the features extracted by the CNN and the transformer, resulting in improved global modeling, better small-target detection, and clearer feature boundaries. Furthermore, SUT adopts full-scale skip connections to realize multi-directional information flow from the encoder to the decoder, enhancing its ability to extract multi-scale features. Experimental results demonstrate that, despite its concise structure, the proposed method performs best on the CDD, LEVIR-CD, and WHU-CD datasets. In particular, on the WHU-CD dataset, SUT improves the F1-score by more than 4% and the intersection over union (IOU) by more than 7% compared with the second-best method.
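The abstract describes attention-driven fusion of CNN and transformer features at each scale. The sketch below is a minimal illustration of that general idea in PyTorch: a channel-attention gate derived from the transformer branch re-weights the CNN branch before the two are merged. The module name `CNNTransformerFusion`, the gating design, and all shapes are assumptions for illustration only; this is not the paper's actual PAM or SUT architecture.

```python
# Illustrative sketch only: a simplified attention-based fusion block that
# mirrors the idea of integrating CNN and transformer features. It is NOT
# the paper's progressive attention module (PAM); names and shapes are assumed.
import torch
import torch.nn as nn


class CNNTransformerFusion(nn.Module):
    """Fuse a CNN feature map with a transformer feature map of the same size.

    The CNN branch supplies local detail; the transformer branch supplies
    global context. Channel attention computed from the transformer branch
    re-weights the CNN features before the two branches are merged.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Channel attention derived from globally pooled transformer features.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # 3x3 convolution to merge the concatenated branches.
        self.merge = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, cnn_feat: torch.Tensor, trans_feat: torch.Tensor) -> torch.Tensor:
        # Re-weight local CNN features with global context from the transformer.
        gated_cnn = cnn_feat * self.channel_gate(trans_feat)
        # Concatenate and merge the two branches into one fused feature map.
        return self.merge(torch.cat([gated_cnn, trans_feat], dim=1))


if __name__ == "__main__":
    fusion = CNNTransformerFusion(channels=64)
    cnn_feat = torch.randn(2, 64, 32, 32)    # CNN features at one scale (assumed shape)
    trans_feat = torch.randn(2, 64, 32, 32)  # transformer features at the same scale
    print(fusion(cnn_feat, trans_feat).shape)  # torch.Size([2, 64, 32, 32])
```

In the full network described by the abstract, fused features like these would additionally be exchanged across scales via full-scale skip connections (in the spirit of Unet3+); that multi-scale wiring is omitted here for brevity.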

Keywords