IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2022)

Feature Matching and Position Matching Between Optical and SAR With Local Deep Feature Descriptor

  • Yun Liao,
  • Yide Di,
  • Hao Zhou,
  • Anran Li,
  • Junhui Liu,
  • Mingyu Lu,
  • Qing Duan

DOI
https://doi.org/10.1109/JSTARS.2021.3134676
Journal volume & issue
Vol. 15
pp. 448 – 462

Abstract

Image matching between optical and synthetic aperture radar (SAR) images is one of the most fundamental problems in earth observation. In recent years, many researchers have applied their domain expertise to design handcrafted descriptors for finding matches between optical and SAR images. However, the large nonlinear radiometric differences between optical and SAR images make image matching very difficult. To address these problems, this article proposes an efficient feature matching and position matching algorithm (MatchosNet) based on a local deep feature descriptor. First, a new dataset is presented by collecting a large number of corresponding SAR and optical images. Then, a deep convolutional network with dense blocks and cross stage partial networks is designed to generate deep feature descriptors. Next, the hard L2 loss function and the ARCpatch loss function are designed to improve matching performance. In addition, building on the feature matching, a two-dimensional (2-D) Gaussian function voting algorithm is designed to match the positions of optical and SAR images of different sizes. Finally, extensive quantitative experiments show that MatchosNet achieves excellent performance in both feature matching and position matching. The code will be released at: https://github.com/LiaoYun0x0/Feature-Matching-and-Position-Matching-between-Optical-and-SAR.
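The 2-D Gaussian voting idea mentioned in the abstract can be illustrated with a minimal sketch: each feature match between a small patch and a larger image casts a Gaussian-weighted vote for a translation offset on an accumulator grid, and the grid's peak is taken as the consensus position. This is only an illustrative toy, assuming pure-translation matching; the function name, parameters, and details are hypothetical and not taken from the paper's implementation.

```python
import numpy as np

def gaussian_vote_position(matches, grid_shape, sigma=5.0):
    """Estimate where a small patch sits inside a larger image.

    matches    : list of ((px, py), (ix, iy)) keypoint pairs, where
                 (px, py) is a patch coordinate and (ix, iy) its
                 matched coordinate in the larger image.
    grid_shape : (height, width) of the offset accumulator grid.
    sigma      : std-dev of each vote's 2-D Gaussian kernel (pixels).

    Each match votes for the offset (ix - px, iy - py) by adding a
    Gaussian bump to the accumulator; outlier matches spread their
    vote elsewhere and are outvoted by the consistent majority.
    """
    acc = np.zeros(grid_shape, dtype=np.float64)
    ys = np.arange(grid_shape[0])[:, None]  # row coords (dy axis)
    xs = np.arange(grid_shape[1])[None, :]  # col coords (dx axis)
    for (px, py), (ix, iy) in matches:
        dx, dy = ix - px, iy - py  # offset this match votes for
        acc += np.exp(-((xs - dx) ** 2 + (ys - dy) ** 2)
                      / (2.0 * sigma ** 2))
    peak_y, peak_x = np.unravel_index(np.argmax(acc), acc.shape)
    return int(peak_x), int(peak_y)  # consensus (dx, dy)

# Three consistent matches voting for offset (40, 25), plus one outlier.
matches = [((10, 10), (50, 35)), ((20, 15), (60, 40)),
           ((5, 30), (45, 55)), ((8, 8), (90, 90))]
print(gaussian_vote_position(matches, (100, 100)))  # -> (40, 25)
```

The Gaussian spread makes the vote tolerant of small localization errors in the keypoints, while a single outlier (the fourth pair above) cannot outweigh the consistent cluster.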

Keywords