Remote Sensing (Jul 2023)

SRTPN: Scale and Rotation Transform Prediction Net for Multimodal Remote Sensing Image Registration

  • Xiangzeng Liu,
  • Xueling Xu,
  • Xiaodong Zhang,
  • Qiguang Miao,
  • Lei Wang,
  • Liang Chang,
  • Ruyi Liu

DOI
https://doi.org/10.3390/rs15143469
Journal volume & issue
Vol. 15, no. 14
p. 3469

Abstract


How to recover geometric transformations is one of the most challenging issues in image registration. To alleviate the effect of large geometric distortion in multimodal remote sensing image registration, a scale and rotation transform prediction net is proposed in this paper. First, to reduce the scale difference between the reference and sensed images, an image scale regression module is constructed via CNN feature extraction and FFT correlation, so that the scale of the sensed image can be recovered roughly. Second, a rotation estimation module is developed to predict the rotation angle between the reference and the scale-recovered images. Finally, to obtain accurate registration results, LoFTR is employed to match the geometry-recovered images. The proposed registration network was evaluated on the GoogleEarth, HRMS, VIS-NIR and UAV datasets, which exhibit contrast differences and geometric distortions. The experimental results show that the correct match rate of our model reached 74.6% and the RMSE of the registration results reached 1.236, which is superior to related methods.
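The sketch below illustrates the coarse scale/rotation recovery stage described in the abstract using a classical Fourier-Mellin-style log-polar phase correlation as a stand-in for the paper's CNN + FFT correlation modules; it is not the authors' implementation, and the function names are illustrative assumptions. Images are assumed to be single-channel arrays of equal size.

```python
# Minimal classical stand-in for the coarse scale/rotation pre-alignment stage.
# Uses log-polar magnitude spectra + phase correlation (Fourier-Mellin), NOT the
# paper's CNN-based scale regression and rotation estimation modules.
import cv2
import numpy as np

def estimate_scale_rotation(reference: np.ndarray, sensed: np.ndarray):
    """Roughly estimate the scale factor and rotation angle of `sensed`
    relative to `reference` (grayscale images of the same size)."""
    def log_polar_spectrum(img: np.ndarray) -> np.ndarray:
        # Magnitude spectrum is translation-invariant; log-polar resampling
        # turns scale and rotation into shifts along the two axes.
        f = np.fft.fftshift(np.fft.fft2(np.float32(img)))
        mag = np.log1p(np.abs(f)).astype(np.float32)
        center = (mag.shape[1] / 2, mag.shape[0] / 2)
        max_radius = min(center)
        return cv2.warpPolar(mag, mag.shape[::-1], center, max_radius,
                             cv2.WARP_POLAR_LOG)

    lp_ref = log_polar_spectrum(reference)
    lp_sen = log_polar_spectrum(sensed)
    # A shift along x in log-polar space corresponds to scale,
    # a shift along y corresponds to rotation.
    (shift_x, shift_y), _ = cv2.phaseCorrelate(lp_ref, lp_sen)
    max_radius = min(reference.shape) / 2
    width, height = reference.shape[1], reference.shape[0]
    scale = np.exp(shift_x * np.log(max_radius) / width)
    angle = 360.0 * shift_y / height
    return scale, angle

def recover_geometry(sensed: np.ndarray, scale: float, angle: float) -> np.ndarray:
    """Resample the sensed image so its scale and rotation roughly match the reference."""
    h, w = sensed.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0 / scale)
    return cv2.warpAffine(sensed, M, (w, h))

# After this coarse recovery, a detector-free matcher such as LoFTR
# (e.g. kornia.feature.LoFTR) would be applied to the reference and the
# geometry-recovered image to obtain the final fine correspondences.
```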

Keywords