IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)
Bitemporal Attention Sharing Network for Remote Sensing Image Change Detection
Abstract
With the advancement of remote sensing imaging technology, the availability of very high-resolution image data has brought new challenges to change detection (CD). Currently, deep learning-based CD methods commonly employ bitemporal interaction networks built on convolutional neural networks or transformers. Yet these models overemphasize object-level accuracy, incurring significantly higher computational costs for limited performance gains. In addition, current bitemporal interaction mechanisms are simplistic: they fail to adequately account for the spatial positions and scale variations of different objects, and therefore model the dynamic feature changes between images inaccurately. To address these issues, a bitemporal attention sharing network is proposed that makes bitemporal and multiscale attention sharing the primary mode of feature interaction. Specifically, the proposed bitemporal attention sharing module leverages feature pairs preliminarily encoded by a backbone to construct shared global features that direct attention to the changed targets. Then, through cross-scale attention guidance and weighted fusion, it shares attention across multiscale features, removing the need to rely heavily on deep convolutional layers for feature extraction. Experiments on three public datasets demonstrate that, compared with several state-of-the-art methods, our model achieves superior performance at low computational cost.
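To make the core idea of the abstract concrete, the following is a minimal sketch, assuming a PyTorch-style implementation: two backbone feature maps from the two acquisition dates are pooled into one shared global descriptor, and the same attention vector re-weights both temporal branches. The module name, pooling choice, and MLP design are illustrative assumptions, not the authors' actual architecture, which also includes cross-scale attention guidance and weighted fusion.

# Illustrative sketch only (assumed PyTorch API); not the paper's released code.
import torch
import torch.nn as nn


class BitemporalAttentionSharing(nn.Module):
    """Toy module: one shared channel-attention vector for a bitemporal feature pair."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Shared bottleneck MLP that produces a single attention vector for both dates.
        self.mlp = nn.Sequential(
            nn.Linear(2 * channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, f1: torch.Tensor, f2: torch.Tensor):
        # f1, f2: (B, C, H, W) backbone features of the two acquisition dates.
        b, c, _, _ = f1.shape
        # Build a shared global descriptor from both temporal features.
        g = torch.cat([f1.mean(dim=(2, 3)), f2.mean(dim=(2, 3))], dim=1)  # (B, 2C)
        att = self.mlp(g).view(b, c, 1, 1)  # shared attention weights
        # The same attention re-weights both branches, i.e., attention "sharing".
        return f1 * att, f2 * att


if __name__ == "__main__":
    module = BitemporalAttentionSharing(channels=64)
    t1, t2 = torch.randn(2, 64, 32, 32), torch.randn(2, 64, 32, 32)
    o1, o2 = module(t1, t2)
    print(o1.shape, o2.shape)  # torch.Size([2, 64, 32, 32]) for both outputs

In this reading, sharing one attention vector across both dates is what steers both branches toward common change cues instead of letting each branch attend independently; the full network would apply analogous sharing across scales before fusion.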
Keywords