IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)

FDA-FFNet: A Feature-Distance Attention-Based Change Detection Network for Remote Sensing Image

  • Wenguang Peng,
  • Wenzhong Shi,
  • Min Zhang,
  • Lukang Wang

DOI
https://doi.org/10.1109/JSTARS.2023.3344633
Journal volume & issue
Vol. 17
pp. 2224 – 2233

Abstract

Convolutional neural networks have demonstrated remarkable capability in extracting deep semantic features from images, leading to significant advances in various image processing tasks. This success has also opened up new possibilities for change detection (CD) in remote sensing applications. However, unlike conventional image recognition tasks, the performance of CD models depends heavily on how the features from the two temporal phases of the imagery are fused. Existing deep-learning-based CD methods typically fuse the features of bitemporal images by difference or concatenation. These approaches often fail to prioritize potential change areas adequately and neglect the rich contextual information essential for discerning subtle changes, which can slow convergence and reduce accuracy. To tackle this challenge, we propose a novel feature fusion approach, the feature-difference attention-based feature fusion CD network. This method enhances feature fusion by incorporating a feature-difference attention-based feature fusion module, enabling a more focused analysis of change areas. Additionally, a deep-supervised attention module applies deep supervision for cascading refinement of change areas. Furthermore, an atrous spatial pyramid pooling fast module is employed to efficiently acquire multiscale object information. The proposed method is evaluated on two publicly available datasets, the WHU-CD and LEVIR-CD datasets. Compared with state-of-the-art CD methods, the proposed method outperforms them on all metrics, with an intersection over union of 92.49% and 85.56%, respectively.
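The core fusion idea described in the abstract — weighting bitemporal features by an attention map derived from their difference, rather than fusing them by raw difference or concatenation alone — can be sketched as follows. This is a minimal NumPy illustration of the general difference-attention pattern, not the authors' implementation; the function name, the sigmoid gating, and the channel-mean reduction are all illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feature_difference_attention_fusion(f1, f2):
    """Fuse two bitemporal feature maps of shape (C, H, W).

    An attention map is computed from the absolute feature difference,
    so spatial locations that changed between the two phases receive
    higher weight before the features are concatenated for decoding.
    (Illustrative sketch, not the paper's exact module.)
    """
    diff = np.abs(f1 - f2)                              # emphasize changed regions
    attn = sigmoid(diff.mean(axis=0, keepdims=True))    # (1, H, W) spatial attention
    fused = np.concatenate([f1 * attn, f2 * attn], axis=0)  # (2C, H, W)
    return fused

rng = np.random.default_rng(0)
f1 = rng.standard_normal((8, 4, 4))   # phase-1 features: 8 channels, 4x4 map
f2 = rng.standard_normal((8, 4, 4))   # phase-2 features
out = feature_difference_attention_fusion(f1, f2)
print(out.shape)  # (16, 4, 4)
```

In a real CD network the attention would typically be produced by learned convolutions on the difference features; the fixed sigmoid-of-mean here only conveys the structure of the computation.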

Keywords