IET Image Processing (Nov 2024)

Hiding image with inception transformer

  • Yunyun Dong,
  • Ping Wei,
  • Ruxin Wang,
  • Bingbing Song,
  • Tingchu Wei,
  • Wei Zhou

DOI
https://doi.org/10.1049/ipr2.13225
Journal volume & issue
Vol. 18, no. 13
pp. 3961 – 3975

Abstract


Image steganography aims to hide secret data in a cover medium for covert communication. Although many deep-learning-based image steganography methods have been proposed, they struggle to build long-distance connections between the cover and secret images, leading to noticeable modification traces and poor resistance to steganalysis. To improve the visual imperceptibility of generated stego images, it is essential to establish a global correlation between the cover and secret images, so that the secret image can be dispersed throughout the cover image globally. To bridge this gap, a novel image steganography framework called HiiT is proposed, which leverages both CNNs and Transformers to learn local and global pixel correlations for image hiding. Specifically, a new Transformer structure called the Inception Transformer is proposed, which incorporates an Inception network into the attention-based Transformer architecture. The Inception network learns multi-scale image features using multiple convolution kernels, while the attention mechanism learns global pixel correlations. Together, these components allow the Inception Transformer to learn long-distance pixel dependencies between the cover and secret images. Furthermore, a 'Skip Connection' mechanism is introduced in the Inception Transformer, which merges low-level visual features with high-level semantic features and improves model performance. HiiT generates high-quality stego images with 45.46 dB PSNR and 0.9915 SSIM, and the recovered secret images reach 47.27 dB PSNR and 0.9952 SSIM. Extensive experiments show that HiiT significantly improves image-hiding performance compared with state-of-the-art methods.
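To make the described architecture concrete, the following is a minimal PyTorch sketch (not the authors' code) of an Inception-Transformer-style block as the abstract outlines it: parallel convolution branches with different kernel sizes for multi-scale local features, multi-head self-attention over spatial positions for global pixel correlation, and a skip connection merging the block input with its output. All layer choices, channel counts, and the `InceptionTransformerBlock` name are illustrative assumptions.

```python
# Hypothetical sketch of an Inception-Transformer-style block, assuming:
# multi-kernel conv branches (Inception part) + self-attention (Transformer part)
# + a skip connection merging low-level input with high-level output.
import torch
import torch.nn as nn


class InceptionTransformerBlock(nn.Module):
    def __init__(self, channels: int, num_heads: int = 2):
        super().__init__()
        # Inception-style branches: several kernel sizes over the same input.
        self.branch1 = nn.Conv2d(channels, channels, kernel_size=1)
        self.branch3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)
        self.fuse = nn.Conv2d(3 * channels, channels, kernel_size=1)
        # Self-attention over flattened spatial positions for global correlation.
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Multi-scale local features (Inception part).
        local = self.fuse(torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x)], dim=1))
        # Global pixel correlation via self-attention (Transformer part).
        tokens = self.norm(local.flatten(2).transpose(1, 2))   # (B, H*W, C)
        attended, _ = self.attn(tokens, tokens, tokens)
        global_feat = attended.transpose(1, 2).reshape(b, c, h, w)
        # Skip connection: merge low-level input with high-level output.
        return x + global_feat


if __name__ == "__main__":
    # A 6-channel input could represent a concatenated cover + secret RGB pair.
    block = InceptionTransformerBlock(channels=6, num_heads=2)
    dummy = torch.randn(1, 6, 64, 64)
    print(block(dummy).shape)  # torch.Size([1, 6, 64, 64])
```

In this sketch the residual addition plays the role of the 'Skip Connection' described in the abstract; the paper may merge features differently (e.g. by concatenation or across encoder stages).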

Keywords