Mathematics (Aug 2022)

PEGANs: Phased Evolutionary Generative Adversarial Networks with Self-Attention Module

  • Yu Xue,
  • Weinan Tong,
  • Ferrante Neri,
  • Yixia Zhang

DOI
https://doi.org/10.3390/math10152792
Journal volume & issue
Vol. 10, no. 15
p. 2792

Abstract


Generative adversarial networks (GANs) have achieved remarkable results in generative tasks. However, instability and mode collapse remain frequent problems. We improve the framework of evolutionary generative adversarial networks (E-GANs), calling the result phased evolutionary generative adversarial networks (PEGANs), and adopt a self-attention module to mitigate the limitations of convolutional operations. During training, the discriminator plays against multiple generators simultaneously, with each generator adopting a different objective function as a mutation operation. After a specified number of training iterations, the generator individuals are evaluated and the best-performing generator offspring is retained for the next round of evolution. In this way, the generator can continuously adjust its training strategy during training, while the self-attention module equips the model with the ability to capture long-range dependencies. Experiments on two datasets show that PEGANs improve training stability and are competitive in generating high-quality samples.
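
The following is a minimal sketch of the phased evolutionary selection loop outlined in the abstract, not the authors' implementation. It assumes a PyTorch setup with E-GAN-style mutation objectives (minimax, heuristic/non-saturating, least-squares); the toy models, the `PHASE_ITERS` constant, and the quality-only `fitness` score are illustrative assumptions, and the self-attention module is omitted for brevity.

```python
import copy
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM, PHASE_ITERS, N_PHASES = 8, 2, 100, 5

def make_generator():
    # Tiny placeholder generator; the paper's networks are convolutional with self-attention.
    return nn.Sequential(nn.Linear(LATENT_DIM, 32), nn.ReLU(), nn.Linear(32, DATA_DIM))

# Single discriminator shared by all generator offspring.
discriminator = nn.Sequential(nn.Linear(DATA_DIM, 32), nn.LeakyReLU(0.2), nn.Linear(32, 1))
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):
    # Stand-in for sampling from the real dataset.
    return torch.randn(n, DATA_DIM) * 0.5 + 2.0

# Three mutation objectives for the generator (E-GAN style, assumed here).
def g_loss_minimax(d_fake):
    return -bce(d_fake, torch.zeros_like(d_fake))          # minimize log(1 - D(G(z)))
def g_loss_heuristic(d_fake):
    return bce(d_fake, torch.ones_like(d_fake))             # minimize -log D(G(z))
def g_loss_least_squares(d_fake):
    return ((torch.sigmoid(d_fake) - 1.0) ** 2).mean()      # least-squares target 1

MUTATIONS = [g_loss_minimax, g_loss_heuristic, g_loss_least_squares]

def fitness(gen):
    """Toy quality score: how strongly D believes the generated samples are real."""
    with torch.no_grad():
        z = torch.randn(256, LATENT_DIM)
        return torch.sigmoid(discriminator(gen(z))).mean().item()

parent = make_generator()
for phase in range(N_PHASES):
    # Each phase: spawn one offspring per mutation objective from the current parent.
    offspring = [copy.deepcopy(parent) for _ in MUTATIONS]
    g_opts = [torch.optim.Adam(g.parameters(), lr=2e-4) for g in offspring]

    for _ in range(PHASE_ITERS):
        # The discriminator plays against all generator offspring simultaneously.
        d_opt.zero_grad()
        real = real_batch()
        d_loss = bce(discriminator(real), torch.ones(real.size(0), 1))
        for g in offspring:
            fake = g(torch.randn(real.size(0), LATENT_DIM)).detach()
            d_loss = d_loss + bce(discriminator(fake), torch.zeros(real.size(0), 1)) / len(offspring)
        d_loss.backward()
        d_opt.step()

        # Each offspring is updated with its own mutation objective.
        for g, opt, mutation in zip(offspring, g_opts, MUTATIONS):
            opt.zero_grad()
            d_fake = discriminator(g(torch.randn(64, LATENT_DIM)))
            mutation(d_fake).backward()
            opt.step()

    # End of phase: evaluate the offspring and keep only the best performer
    # as the parent for the next round of evolution.
    parent = max(offspring, key=fitness)
```

The key design point illustrated here is that selection happens only at phase boundaries (every `PHASE_ITERS` iterations) rather than after every update, which lets each mutation strategy train long enough to be evaluated fairly before the population is pruned back to a single parent.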

Keywords