IEEE Access (Jan 2024)

A Character Based Steganography Using Masked Language Modeling

  • Emir Ozturk
  • Andac Sahin Mesut
  • Ozlem Aydin Fidan

DOI
https://doi.org/10.1109/ACCESS.2024.3354710
Journal volume & issue
Vol. 12
pp. 14248–14259

Abstract


In this study, a steganography method based on the BERT transformer model is proposed for hiding text data in a cover text. The aim is to hide information by replacing specific words within the text using BERT's masked language modeling (MLM) feature. Two models, fine-tuned for English and Turkish, are used to perform steganography on texts in these languages; moreover, the proposed method can work with any transformer model that supports masked language modeling. While the amount of information that can traditionally be hidden in text is limited, the proposed method allows a significant amount of data to be hidden without distorting the text's meaning. The method is tested by hiding secret texts of varying lengths in cover texts of different lengths in two language scenarios. The results are analyzed in terms of perplexity, KL divergence, and semantic similarity. Compared to other methods in the literature, the proposed method achieves the best results, with a KL divergence of 7.93 and a semantic similarity of 0.99, indicating low detectability and success in the data hiding process.
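The core idea described in the abstract, encoding secret bits by choosing among the candidate words that a masked language model predicts for a masked position, can be illustrated with a short sketch. The snippet below is a minimal illustration using the Hugging Face transformers library and the bert-base-uncased checkpoint; it is not the authors' implementation, and the function name, the fixed mask position, and the bits-per-word parameter are assumptions made for this example only.

# Minimal sketch of MLM-based word-replacement steganography.
# Assumes the Hugging Face "transformers" library and "bert-base-uncased".
# This is NOT the paper's exact encoder; it only shows the general idea of
# encoding secret bits by selecting among BERT's masked-token predictions.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def embed_bits(cover_sentence: str, mask_position: int, bits: str, bits_per_word: int = 2) -> str:
    """Replace the token at mask_position with one of the top 2**bits_per_word
    MLM candidates, chosen by the next bits_per_word bits of the secret message."""
    tokens = tokenizer.tokenize(cover_sentence)
    tokens[mask_position] = tokenizer.mask_token
    inputs = tokenizer(" ".join(tokens), return_tensors="pt")
    # Locate the [MASK] token in the encoded input.
    mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    # Rank candidate tokens for the masked slot.
    top_ids = logits[0, mask_index].topk(2 ** bits_per_word).indices[0]
    # Pick the candidate whose rank encodes the next secret bits.
    choice = int(bits[:bits_per_word], 2)
    tokens[mask_position] = tokenizer.convert_ids_to_tokens(int(top_ids[choice]))
    return tokenizer.convert_tokens_to_string(tokens)

# Example: encode the bits "10" by replacing the second word of the cover sentence.
print(embed_bits("the cat sat on the mat", mask_position=1, bits="10"))

The receiver, holding the same model and the agreed mask positions, can recover the bits by re-ranking the candidates for each replaced word and reading off the rank of the word actually present in the stego text.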

Keywords