IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)
Dual Attention-Based Global-Local Feature Extraction Network for Unsupervised Change Detection in PolSAR Images
Abstract
Due to the interference of multiplicative speckle, it is challenging to accurately detect changes in polarimetric synthetic aperture radar (PolSAR) images. Convolutional neural networks have been shown to learn rich local features from PolSAR data. However, convolution kernels with limited receptive fields have difficulty exploring global information. Here, a dual attention-based global-local feature extraction network (DA-GLN) is developed for unsupervised PolSAR image change detection (CD). First, we use fuzzy C-means clustering on the enhanced Shannon entropy difference image to automatically generate the pseudolabeled samples required for unsupervised CD. Subsequently, our DA-GLN utilizes a deep residual shrinkage network that incorporates channel attention and soft thresholding to suppress the influence of speckle noise and capture local features. Meanwhile, a pooling-based vision transformer is adopted in DA-GLN to extract global features; it introduces pooling layers to perform the spatial information interaction of self-attention more efficiently than the standard vision transformer. Furthermore, a global-local constraint feature fusion strategy is designed to fuse local and global features effectively. Finally, we employ a feature constraint-focal loss, combining a feature constraint loss and a focal loss, as the objective function of DA-GLN. Specifically, the feature constraint loss is constructed to eliminate feature redundancy and fully exploit the complementarity between features, while the focal loss is introduced to mitigate the impact of the imbalance between changed and unchanged samples on the network. Numerical experiments on five real spaceborne PolSAR datasets demonstrate that our DA-GLN is more competitive than other state-of-the-art methods.
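As context for the residual shrinkage and loss components mentioned in the abstract, the sketch below illustrates two of the generic building blocks involved: channel-wise soft thresholding as used in deep residual shrinkage networks, and a standard binary focal loss. It is a minimal PyTorch sketch under assumed module and parameter names, not the authors' implementation of DA-GLN.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelSoftThreshold(nn.Module):
    """Channel-wise soft thresholding (deep residual shrinkage style):
    a small attention branch predicts a per-channel threshold from the
    global average of |x|, and soft thresholding shrinks small, noise-like
    responses toward zero."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W); average magnitude per channel
        avg_abs = x.abs().mean(dim=(2, 3))          # (N, C)
        tau = avg_abs * self.fc(avg_abs)            # per-channel threshold
        tau = tau.unsqueeze(-1).unsqueeze(-1)       # (N, C, 1, 1)
        # soft thresholding: sign(x) * max(|x| - tau, 0)
        return torch.sign(x) * F.relu(x.abs() - tau)

def focal_loss(logits: torch.Tensor, targets: torch.Tensor,
               gamma: float = 2.0, alpha: float = 0.25) -> torch.Tensor:
    """Binary focal loss for changed/unchanged pixels: down-weights easy,
    well-classified samples so the minority (changed) class is not swamped."""
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = torch.exp(-ce)                            # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1.0 - p_t) ** gamma * ce).mean()
```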
Keywords