IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)

Self-Supervised Feature Representation for SAR Image Target Classification Using Contrastive Learning

  • Hao Pei,
  • Mingjie Su,
  • Gang Xu,
  • Mengdao Xing,
  • Wei Hong

DOI
https://doi.org/10.1109/JSTARS.2023.3321769
Journal volume & issue
Vol. 16
pp. 9246 – 9258

Abstract

Deep neural networks (DNNs) have been widely applied to synthetic aperture radar (SAR) image interpretation tasks, such as target classification and recognition, since they can automatically learn high-level semantic features in a data-driven and task-driven manner. Supervised learning methods, however, require abundant labeled samples to avoid overfitting of the designed networks, and such samples are usually difficult to obtain for SAR applications. To address this issue, a novel two-stage algorithm based on contrastive learning (CL) is proposed for SAR image target classification. In the pretraining stage, a convolutional neural network (CNN)-based encoder is first pretrained with a contrastive strategy to extract self-supervised representations (SSRs) from an unlabeled training set; this encoder maps SAR images into a discriminative embedding space. The optimal encoder is selected using a linear evaluation protocol, which indirectly confirms the transferability of the prelearned SSRs to downstream tasks. In the fine-tuning stage, a SAR target classifier is then adequately trained in a supervised manner using only a few labeled SSRs, benefiting from the powerful pretrained encoder. Numerical experiments on the public MSTAR dataset demonstrate that the model based on the proposed self-supervised feature learning algorithm is superior to conventional supervised methods under labeled-data constraints. In addition, knowledge transfer experiments on the OpenSARShip dataset show that the encoder pretrained on MSTAR can support classifier training with high efficiency and precision. These results demonstrate the excellent training convergence and classification performance of the proposed algorithm.
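
The two-stage pipeline summarized in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the small CNN encoder, the SimCLR-style NT-Xent contrastive loss, the additive-noise augmentation, the embedding sizes, and the random tensors standing in for MSTAR chips and labels are all placeholders chosen for the example.

```python
# Hedged sketch of a two-stage pipeline: (1) contrastive pretraining of a CNN
# encoder on unlabeled SAR chips, (2) supervised training of a small classifier
# on the frozen self-supervised representations (SSRs). All architectures,
# augmentations, and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Small CNN mapping a single-channel SAR chip to an embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Projection head used only by the contrastive loss
        self.proj = nn.Sequential(nn.Linear(64, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        h = self.features(x)                    # representation (SSR) for downstream use
        z = F.normalize(self.proj(h), dim=1)    # projection for the contrastive loss
        return h, z

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss over two augmented views (SimCLR-style)."""
    z = torch.cat([z1, z2], dim=0)              # (2N, d), rows already L2-normalized
    sim = z @ z.t() / tau                       # temperature-scaled cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))  # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)        # positive pair = the other view

# --- Stage 1: self-supervised pretraining on unlabeled chips ---
encoder = Encoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
unlabeled = torch.rand(256, 1, 64, 64)          # stand-in for unlabeled SAR chips
for step in range(10):                          # illustrative; real training runs far longer
    x = unlabeled[torch.randint(0, unlabeled.size(0), (32,))]
    # two stochastic "views" of each chip (additive noise as a placeholder augmentation)
    _, z1 = encoder(x + 0.05 * torch.randn_like(x))
    _, z2 = encoder(x + 0.05 * torch.randn_like(x))
    loss = nt_xent(z1, z2)
    opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: train a lightweight classifier on a few labeled SSRs ---
classifier = nn.Linear(64, 10)                  # e.g., 10 MSTAR target classes
clf_opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
x_lab = torch.rand(64, 1, 64, 64)               # small labeled subset (placeholder)
y_lab = torch.randint(0, 10, (64,))
encoder.eval()
with torch.no_grad():
    ssr, _ = encoder(x_lab)                     # frozen self-supervised representations
for step in range(10):
    loss = F.cross_entropy(classifier(ssr), y_lab)
    clf_opt.zero_grad(); loss.backward(); clf_opt.step()
```

Freezing the encoder in the second stage mirrors the linear evaluation protocol mentioned in the abstract; the paper's fine-tuning stage may differ in which parameters are updated and in the specific contrastive strategy used.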

Keywords