Entropy (Dec 2020)

Improving Deep Interactive Evolution with a Style-Based Generator for Artistic Expression and Creative Exploration

  • Carlos Tejeda-Ocampo,
  • Armando López-Cuevas,
  • Hugo Terashima-Marin

DOI
https://doi.org/10.3390/e23010011
Journal volume & issue
Vol. 23, no. 1
p. 11

Abstract


Deep interactive evolution (DeepIE) combines the capacity of interactive evolutionary computation (IEC) to capture a user’s preference with the domain-specific robustness of a trained generative adversarial network (GAN) generator, allowing the user to control the GAN output through evolutionary exploration of the latent space. However, the traditional GAN latent space presents feature entanglement, which limits the practicality of possible DeepIE applications. In this paper, we implement DeepIE within a style-based generator from a StyleGAN model trained on the WikiArt dataset and propose StyleIE, a variation of DeepIE that takes advantage of the secondary disentangled latent space in the style-based generator. We performed two AB/BA crossover user tests that compared the performance of DeepIE against StyleIE for art generation. Self-rated evaluations of performance were collected through a questionnaire. Findings from the tests suggest that StyleIE and DeepIE perform equally in tasks with open-ended goals and relaxed constraints, but StyleIE performs better in closed-ended and more constrained tasks.
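The abstract describes an interactive-evolution loop in which user selections act as the fitness signal and the population is a set of latent vectors fed to a generator; StyleIE differs from DeepIE mainly in evolving vectors in StyleGAN's secondary, more disentangled W space rather than the primary Z space. The following is a minimal, hedged sketch of that loop. The generator, mapping network, user-selection step, and all hyperparameters here are placeholders chosen for illustration, not the authors' implementation or the StyleGAN API.

```python
import numpy as np

LATENT_DIM = 512   # assumed latent dimensionality
POP_SIZE = 16      # assumed population size shown to the user
MUTATION_STD = 0.3 # assumed mutation strength

def mapping_network(z):
    # Placeholder for StyleGAN's mapping network f: Z -> W.
    # StyleIE evolves in this secondary, more disentangled W space.
    return np.tanh(z)

def generate_image(w):
    # Placeholder for the style-based synthesis network g(w);
    # a real implementation would return an image tensor.
    return w

def user_selects(images):
    # Placeholder for the interactive step: the user picks preferred
    # outputs, which serves as the fitness signal. Random here.
    return np.random.choice(len(images), size=2, replace=False)

def evolve(selected, pop_size):
    # Breed a new population from the selected latents via uniform
    # crossover and Gaussian mutation (one common IEC choice).
    children = []
    while len(children) < pop_size:
        a, b = selected[np.random.randint(len(selected), size=2)]
        mask = np.random.rand(LATENT_DIM) < 0.5
        child = np.where(mask, a, b) + np.random.randn(LATENT_DIM) * MUTATION_STD
        children.append(child)
    return np.stack(children)

# StyleIE variant: sample z, map once to W, then evolve directly in W space.
population = mapping_network(np.random.randn(POP_SIZE, LATENT_DIM))
for generation in range(5):
    images = [generate_image(w) for w in population]
    chosen = user_selects(images)
    population = evolve(population[chosen], POP_SIZE)
```

For the DeepIE baseline, the same loop would instead keep the population in Z and apply the mapping and synthesis networks only when rendering candidates for the user.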

Keywords