Graphical Models (Oct 2023)

Neural style transfer for 3D meshes

  • Hongyuan Kang,
  • Xiao Dong,
  • Juan Cao,
  • Zhonggui Chen

Journal volume & issue
Vol. 129
p. 101198

Abstract


Style transfer is a popular research topic in computer vision. In 3D stylization, a mesh model is deformed to achieve a specific geometric style. We explore a general neural style transfer framework for 3D meshes that can transfer multiple geometric styles from other meshes to a given content mesh. Our stylization network is built on a pre-trained MeshNet model, from which a content representation and a Gram-based style representation are extracted. By constraining the similarity of the generated mesh to the content mesh in content representation and to the style mesh in style representation, our network generates a deformed mesh that carries a specific style while preserving the content of the original mesh. Experiments verify the robustness of the proposed network and show the effectiveness of stylizing multiple models with a single dedicated style mesh. We also conduct ablation experiments to analyze the effectiveness of the components of our network.
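To make the content/style constraints concrete, the sketch below shows a Gram-based style loss and a content loss of the kind the abstract describes. It is a minimal illustration, not the authors' implementation: the feature extractor stands in for the pre-trained MeshNet, and the per-face feature shape, layer choice, and loss weighting are assumptions.

```python
# Minimal sketch of content and Gram-based style losses for mesh stylization.
# Assumptions (not from the paper): features come from a MeshNet-like extractor
# with shape (batch, channels, faces); the style weight 1e3 is arbitrary.
import torch
import torch.nn.functional as F


def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Gram matrix of per-face features with shape (batch, channels, faces)."""
    b, c, n = features.shape
    gram = torch.bmm(features, features.transpose(1, 2))  # (batch, channels, channels)
    return gram / (c * n)  # normalize so the loss scale is size-independent


def content_loss(gen_feat: torch.Tensor, content_feat: torch.Tensor) -> torch.Tensor:
    """Keep the generated mesh's features close to the content mesh's features."""
    return F.mse_loss(gen_feat, content_feat)


def style_loss(gen_feat: torch.Tensor, style_feat: torch.Tensor) -> torch.Tensor:
    """Match feature correlations (Gram matrices) between generated and style meshes."""
    return F.mse_loss(gram_matrix(gen_feat), gram_matrix(style_feat))


def total_loss(gen_feat, content_feat, style_feat, style_weight=1e3):
    """Combined objective: content preservation plus weighted style matching."""
    return content_loss(gen_feat, content_feat) + style_weight * style_loss(gen_feat, style_feat)
```

In a typical optimization-based setup, the three feature tensors would be obtained by running the generated, content, and style meshes through the same frozen feature extractor, and the mesh vertex offsets would be updated by gradient descent on `total_loss`.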

Keywords