Applied Sciences (Apr 2022)

Use of a DNN-Based Image Translator with Edge Enhancement Technique to Estimate Correspondence between SAR and Optical Images

  • Hisatoshi Toriya,
  • Ashraf Dewan,
  • Hajime Ikeda,
  • Narihiro Owada,
  • Mahdi Saadat,
  • Fumiaki Inagaki,
  • Youhei Kawamura,
  • Itaru Kitahara

DOI
https://doi.org/10.3390/app12094159
Journal volume & issue
Vol. 12, no. 9
p. 4159

Abstract

In this paper, a method for estimating the local correspondence between synthetic aperture radar (SAR) images and optical images is proposed using an image feature-based keypoint-matching algorithm. To achieve accurate matching, common image features must be obtained at corresponding locations. Because SAR and optical images differ in appearance, it is difficult to find similar features for geometric correction. In this work, an image translator, built with a deep neural network (DNN) and trained by conditional generative adversarial networks (cGANs) with edge enhancement, was employed to find corresponding locations between SAR and optical images. With conventional cGANs, many blurred regions appear in the translated images, degrading keypoint-matching accuracy. Therefore, a novel method that applies an edge enhancement filter within the cGAN structure was proposed to find corresponding points between SAR and optical images and to accurately register images from different sensors. The results suggest that the proposed method can accurately estimate the corresponding points between SAR and optical images.
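The abstract describes a two-stage pipeline: a cGAN-based translator (with edge enhancement) converts a SAR image into an optical-like image, and a feature-based keypoint matcher then estimates correspondences against a real optical image. The sketch below illustrates only the matching stage, assuming the translated image has already been produced by the translator; the ORB detector, the ratio-test threshold, and the file names are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch of the keypoint-matching stage outlined in the abstract.
# Assumes a cGAN-translated SAR image is available on disk; the translator
# itself is not reproduced here.

import cv2
import numpy as np


def match_keypoints(translated_sar_path: str, optical_path: str,
                    ratio: float = 0.75):
    """Estimate corresponding points between a translated SAR image and an
    optical image using ORB features and Lowe's ratio test. ORB and the 0.75
    threshold are illustrative choices, not the paper's exact settings."""
    sar_like = cv2.imread(translated_sar_path, cv2.IMREAD_GRAYSCALE)
    optical = cv2.imread(optical_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=5000)
    kp1, des1 = orb.detectAndCompute(sar_like, None)
    kp2, des2 = orb.detectAndCompute(optical, None)

    # Brute-force Hamming matcher with k-NN matching for the ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]

    # Corresponding (x, y) point pairs in each image.
    pts_sar = np.float32([kp1[m.queryIdx].pt for m in good])
    pts_opt = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts_sar, pts_opt


if __name__ == "__main__":
    # File names are placeholders for illustration.
    src, dst = match_keypoints("translated_sar.png", "optical.png")
    print(f"{len(src)} candidate correspondences found")
```

The resulting point pairs could then be passed to a robust estimator (for example, RANSAC-based homography fitting) to perform the geometric registration between the two sensors; that step is omitted here.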

Keywords