IEEE Access (Jan 2024)

Efficient Text Style Transfer Through Robust Masked Language Model and Iterative Inference

  • Osama Subhani Khan,
  • Naima Iltaf,
  • Usman Zia,
  • Rabia Latif,
  • Nor Shahida Mohd Jamail

DOI
https://doi.org/10.1109/ACCESS.2024.3501320
Journal volume & issue
Vol. 12
pp. 182353–182373

Abstract

The emergence of Large Language Models (LLMs) has prompted researchers to exploit the abilities of such models for text style transfer (TST). However, these models are prone to hallucinations and suffer from the burden of manually crafted prompts and high computational requirements. The purpose of TST is to edit text sequences so that their style is changed without altering the meaning of their content. Owing to the scarcity of parallel data, existing approaches rely on various strategies to identify and replace style attributes or to edit a given sequence as a whole. A successful style transfer should be fluent and preserve the original content. To address these challenges, we propose a novel technique that leverages the explanations of a lightweight, prompt-free, few-shot contrastive-learning-based classifier. First, we create a style-independent corpus of target-style sequences by masking out style attributes and train a generator with a masked language modeling objective that learns to predict target-style tokens. Then, we apply an iterative mechanism that masks source-style sequences and predicts target-style attributes until the style is transferred, as gauged by a pre-trained evaluator model. We conduct experiments on two widely used real-world product-review sentiment datasets in both polarities, i.e., positive to negative and negative to positive. Comparisons with various prompt-based as well as unsupervised-learning-based methods demonstrate the state-of-the-art performance of our approach.
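To make the two-stage pipeline in the abstract concrete, the sketch below illustrates the iterative mask-and-infill inference loop under stated assumptions: an off-the-shelf `roberta-base` fill-mask model stands in for the paper's trained generator, a generic sentiment classifier stands in for the pre-trained style evaluator, and the word-level mask-selection heuristic, candidate count, and iteration cap are illustrative choices, not the authors' released implementation.

```python
# Minimal sketch of iterative mask-and-infill style transfer (assumptions noted above).
from transformers import pipeline

generator = pipeline("fill-mask", model="roberta-base")  # stand-in for the trained MLM generator
evaluator = pipeline("sentiment-analysis")               # stand-in for the pre-trained style evaluator

def transfer_style(sentence: str, target_label: str = "POSITIVE", max_iters: int = 5) -> str:
    for _ in range(max_iters):
        pred = evaluator(sentence)[0]
        if pred["label"] == target_label:  # stop once the evaluator judges the style transferred
            return sentence
        words = sentence.split()
        best, best_score = sentence, -1.0
        # Assumption: mask each word in turn, let the generator infill, and keep
        # the candidate that moves the evaluator's score furthest toward the target style.
        for i in range(len(words)):
            masked = " ".join(words[:i] + [generator.tokenizer.mask_token] + words[i + 1:])
            for cand in generator(masked, top_k=3):
                out = evaluator(cand["sequence"])[0]
                score = out["score"] if out["label"] == target_label else 1.0 - out["score"]
                if score > best_score:
                    best_score, best = score, cand["sequence"]
        sentence = best
    return sentence

print(transfer_style("the food was bland and the service slow."))
```

Exhaustively masking every position keeps the sketch simple but costs one evaluator call per candidate; the paper's classifier explanations presumably pinpoint the style attributes to mask directly, avoiding this brute-force search.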

Keywords