Journal of Hebei University of Science and Technology (Jun 2024)

Cycle consistent style transfer based on style-transition attention

  • Rui’er ZHANG,
  • Xiaohang BIAN,
  • Siyuan LIU,
  • Bin LIU,
  • Jianwu LI,
  • Jun LUO,
  • Mingyue QI

DOI
https://doi.org/10.7535/hbkd.2024yx03012
Journal volume & issue
Vol. 45, no. 3
pp. 328 – 340

Abstract

To solve the problem that existing art style transfer methods cannot maintain high-quality image content while transforming style patterns, a novel style-transition attention network (STANet) was introduced, which consists of two key parts: an asymmetric attention module used to determine the style features of the reference image, and a cyclic structure used to preserve the content of the image. Firstly, a two-stream architecture was adopted to encode the style and content images. Secondly, the attention module was seamlessly integrated into the encoder to generate the style attention representation. Finally, the module was placed at different convolution stages, making the encoders interleaved and facilitating the flow of hierarchical information from style to content. In addition, a cycle consistency loss was proposed to force the network to retain the content structure and style patterns in a holistic manner. The results show that the interleaved encoder is superior to the traditional two-stream architecture, and that STANet can exchange the style patterns of two images with arbitrary styles, producing higher-quality stylized images while better preserving their content. The proposed cycle-consistent style transfer network with style-transition attention yields more detailed stylized images and generalizes well to arbitrary styles.
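
The abstract describes an asymmetric attention module (content features attending to style features) combined with a cycle consistency loss. The following is a minimal PyTorch sketch of these two ideas; the layer sizes, module names, and exact attention form are illustrative assumptions and do not reproduce the paper's STANet implementation.

```python
# Minimal sketch of an asymmetric style-attention block and a cycle
# consistency loss. All names and hyperparameters are assumptions for
# illustration, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StyleAttention(nn.Module):
    """Asymmetric attention: content features query style features."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels, kernel_size=1)  # from content
        self.key = nn.Conv2d(channels, channels, kernel_size=1)    # from style
        self.value = nn.Conv2d(channels, channels, kernel_size=1)  # from style

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        b, c, h, w = content.shape
        q = self.query(content).flatten(2).transpose(1, 2)   # (b, hw, c)
        k = self.key(style).flatten(2)                        # (b, c, hw_s)
        v = self.value(style).flatten(2).transpose(1, 2)      # (b, hw_s, c)
        attn = F.softmax(q @ k / c ** 0.5, dim=-1)            # (b, hw, hw_s)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return content + out  # residual connection helps keep content structure


def cycle_consistency_loss(original: torch.Tensor,
                           reconstructed: torch.Tensor) -> torch.Tensor:
    """L1 penalty between an input image and its round-trip reconstruction
    (stylize, then stylize back), encouraging holistic preservation of
    content structure and style patterns."""
    return F.l1_loss(reconstructed, original)
```

In a cycle-consistent setup, the reconstruction would be obtained by applying the stylization network twice with the roles of the two images swapped, so the loss can be computed for both the content and the style image.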

Keywords