IEEE Access (Jan 2020)

Unbiased Image Style Transfer

  • Hyun-Chul Choi

DOI
https://doi.org/10.1109/ACCESS.2020.3034306
Journal volume & issue
Vol. 8
pp. 196600 – 196608

Abstract


The image style transfer process generates an output image in a target style, at a specified strength, for a given pair of content and target images. Recently, feed-forward neural networks have been employed in this process to quickly decode a linearly interpolated feature in the encoded feature space. However, to date, no study has analyzed the effectiveness of this style interpolation method. In this article, we provide the missing in-depth analysis of style interpolation and propose a new method that controls the strength of the desired style more effectively. The existing methods are biased because the network is trained with one-sided data of full style strength; consequently, they do not guarantee a satisfactory output image at an intermediate style strength. To resolve this problem of a biased network, we propose an unbiased learning technique, which uses unbiased training data and an unbiased loss to allow a feed-forward network to learn the desired regression of style consistent with a specific interpolation function in the encoded feature space. The experimental results verify that our unbiased method achieves better regression learning between the style control parameter and the output image style, as well as more stable style transfer that is insensitive to the weight of the style loss, without adding complexity to the image generation process.
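For context, the feature-space interpolation the abstract refers to can be sketched as below. This is a minimal illustration of the interpolation mechanism only (not the paper's unbiased training procedure), assuming an AdaIN-style encoder/decoder pipeline; the function and module names (`adain`, `interpolated_transfer`, the toy `encoder`/`decoder`) are hypothetical placeholders.

```python
import torch
import torch.nn as nn

def adain(content_feat: torch.Tensor, style_feat: torch.Tensor,
          eps: float = 1e-5) -> torch.Tensor:
    """Re-normalize content features to the channel-wise mean/std of the style features."""
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

def interpolated_transfer(encoder: nn.Module, decoder: nn.Module,
                          content: torch.Tensor, style: torch.Tensor,
                          alpha: float) -> torch.Tensor:
    """Blend the stylized feature with the content feature by `alpha`
    (0 = pure content, 1 = full style strength) before decoding."""
    f_c = encoder(content)
    f_s = encoder(style)
    f_cs = adain(f_c, f_s)
    f_mix = alpha * f_cs + (1.0 - alpha) * f_c  # linear interpolation in feature space
    return decoder(f_mix)

if __name__ == "__main__":
    # Toy encoder/decoder so the sketch runs end to end.
    encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
    decoder = nn.Sequential(nn.Conv2d(16, 3, 3, padding=1))
    content = torch.rand(1, 3, 64, 64)
    style = torch.rand(1, 3, 64, 64)
    out = interpolated_transfer(encoder, decoder, content, style, alpha=0.5)
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The paper's argument concerns how a decoder trained only at full style strength (alpha = 1) behaves at intermediate alpha values; the proposed unbiased training samples the style strength during training so the decoder learns the full regression.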

Keywords