Heliyon (Mar 2024)

Extending user control for image stylization using hierarchical style transfer networks

  • Sunder Ali Khowaja,
  • Sultan Almakdi,
  • Muhammad Ali Memon,
  • Parus Khuwaja,
  • Adel Sulaiman,
  • Ali Alqahtani,
  • Asadullah Shaikh,
  • Abdullah Alghamdi

Journal volume & issue
Vol. 10, no. 5
p. e27012

Abstract


Neural style transfer refers to the re-rendering of a content image while fusing into it the features of a style image. Recent studies focus on either multiple style transfer or arbitrary style transfer, using perceptual and fixpoint content losses in their respective network architectures. These losses produce notable stylization results but give the user little control over the applied style. Consequently, the stylization results also compromise the preservation of details from the content image. This work proposes the hierarchical style transfer network (HSTN) for the image stylization task, which gives the user control over the degree of applied style via a denoising parameter. HSTN incorporates the proposed fixpoint control loss, which preserves details from the content image, and adds a denoising convolutional neural network (DnCNN) together with a denoising loss that allows the user to control the level of stylization. The encoder-decoder block, the DnCNN block, and the loss network block form the basic building blocks of HSTN. Extensive experiments have been carried out, and the results are compared with existing works to demonstrate the effectiveness of HSTN. The subjective user evaluation shows that HSTN's stylization represents the best fusion of style and generates unique stylization results while preserving the content image details, as evidenced by results 12% better than the second-best performing method. The proposed work is also among the studies achieving the best trade-off between content and style classification scores, at 37.64% and 60.27%, respectively.
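To make the described pipeline concrete, the sketch below shows one way an encoder-decoder stylization branch, a DnCNN-style denoising branch, and a user-facing denoising parameter could be composed at inference time. This is a minimal illustration under assumptions, not the authors' implementation: all class names, layer sizes, and the blending rule (here a hypothetical parameter alpha) are placeholders.

```python
# Hedged sketch of an HSTN-style pipeline as described in the abstract.
# Class names, layer sizes, and the alpha blending rule are illustrative
# assumptions, not the authors' released implementation.
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Toy encoder-decoder stylization branch."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

class DnCNN(nn.Module):
    """Shallow residual denoiser in the spirit of DnCNN."""
    def __init__(self, depth=5, channels=32):
        super().__init__()
        layers = [nn.Conv2d(3, channels, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1),
                       nn.BatchNorm2d(channels), nn.ReLU()]
        layers += [nn.Conv2d(channels, 3, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Predict the residual and subtract it from the input.
        return x - self.body(x)

class HierarchicalStylizer(nn.Module):
    """Stylize, then denoise; a user parameter blends the two outputs."""
    def __init__(self):
        super().__init__()
        self.stylizer = EncoderDecoder()
        self.denoiser = DnCNN()

    def forward(self, content, alpha=0.5):
        stylized = self.stylizer(content)
        denoised = self.denoiser(stylized)
        # alpha in [0, 1]: 1.0 keeps the full stylization, 0.0 favors the
        # denoised, more content-preserving output.
        return alpha * stylized + (1.0 - alpha) * denoised

if __name__ == "__main__":
    model = HierarchicalStylizer()
    content = torch.rand(1, 3, 256, 256)
    out = model(content, alpha=0.7)
    print(out.shape)  # torch.Size([1, 3, 256, 256])
```

In this reading, the denoising parameter plays the role the abstract assigns to user control: larger values retain more of the stylization, smaller values pull the result back toward the content-preserving, denoised image. The training losses mentioned in the abstract (fixpoint control loss and denoising loss) are not sketched here.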

Keywords