IEEE Access (Jan 2024)

Frequency-Aware Axial-ShiftedNet in Generative Adversarial Networks for Visible-to-Infrared Image Translation

  • Hsi-Ju Lin,
  • Wei-Yuan Cheng,
  • Duan-Yu Chen

DOI
https://doi.org/10.1109/ACCESS.2024.3478356
Journal volume & issue
Vol. 12
pp. 151432–151443

Abstract


Infrared imagery is indispensable for capturing temperature data by detecting infrared radiation, particularly in challenging low-light environments where visual perception is compromised. As a result, there has been considerable interest in converting visible images into their infrared counterparts. In this research, we present the Freq-ShiftedNet model, which is trained with a generative adversarial network approach. By harnessing the Haar wavelet transform, we preserve frequency information, directing low-frequency features to the Decoder and high-frequency features to the Encoder. Analysis of the KAIST dataset demonstrates that our model outperforms InfraGAN, achieving a Structural Similarity (SSIM) score of 0.825, a 5.4% improvement, and a Learned Perceptual Image Patch Similarity (LPIPS) score of 0.228, a 41.3% decrease. Similarly, on the VEDAI dataset, Freq-ShiftedNet surpasses InfraGAN with an SSIM score of 0.938, a 6.6% improvement. These results highlight the effectiveness of our proposed generator, the successful integration of wavelet features into the Freq-ShiftedNet model, and its suitability for real-world applications.
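The abstract's core idea is a single-level Haar wavelet decomposition that splits an image into one low-frequency approximation band and three high-frequency detail bands. The sketch below is an illustrative NumPy implementation of that standard decomposition, not the paper's code; the function name `haar_dwt2` and the normalization are assumptions, and how the subbands are routed to the Encoder/Decoder is specific to the paper.

```python
import numpy as np

def haar_dwt2(x):
    """Single-level 2D Haar wavelet transform (illustrative sketch).

    Splits an image x of shape (H, W) with even H, W into one
    low-frequency approximation (LL) and three high-frequency detail
    subbands (LH, HL, HH), each of shape (H/2, W/2).
    """
    # Group each non-overlapping 2x2 block into its four corners.
    a = x[0::2, 0::2]  # top-left
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    # Orthonormal 2D Haar combinations (factor 1/2 per level).
    ll = (a + b + c + d) / 2.0  # low-frequency approximation
    lh = (a - b + c - d) / 2.0  # horizontal details
    hl = (a + b - c - d) / 2.0  # vertical details
    hh = (a - b - c + d) / 2.0  # diagonal details
    return ll, lh, hl, hh

# Example: a constant image has no high-frequency content, so the
# detail subbands are all zero and only LL carries energy.
img = np.ones((4, 4))
ll, lh, hl, hh = haar_dwt2(img)
```

Because the transform is invertible, the low-frequency band preserves coarse structure while the detail bands keep edges and texture, which is what makes routing the two kinds of features to different parts of the generator possible without losing information.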

Keywords